Raspbian Package Auto-Building

Build log for consul (1.5.2+dfsg1-6) on armhf

consul 1.5.2+dfsg1-6 armhf → 2019-12-30 19:15:13

sbuild (Debian sbuild) 0.71.0 (24 Aug 2016) on bm-wb-04

+==============================================================================+
| consul 1.5.2+dfsg1-6 (armhf)                 Mon, 30 Dec 2019 17:59:14 +0000 |
+==============================================================================+

Package: consul
Version: 1.5.2+dfsg1-6
Source Version: 1.5.2+dfsg1-6
Distribution: bullseye-staging
Machine Architecture: armhf
Host Architecture: armhf
Build Architecture: armhf

I: NOTICE: Log filtering will replace 'var/lib/schroot/mount/bullseye-staging-armhf-sbuild-8db272d1-67b2-4e85-b2ff-2e9782482bc8' with '<<CHROOT>>'

+------------------------------------------------------------------------------+
| Update chroot                                                                |
+------------------------------------------------------------------------------+

Get:1 http://172.17.0.1/private bullseye-staging InRelease [11.3 kB]
Get:2 http://172.17.0.1/private bullseye-staging/main Sources [11.5 MB]
Get:3 http://172.17.0.1/private bullseye-staging/main armhf Packages [12.7 MB]
Fetched 24.2 MB in 28s (864 kB/s)
Reading package lists...
W: No sandbox user '_apt' on the system, can not drop privileges

+------------------------------------------------------------------------------+
| Fetch source files                                                           |
+------------------------------------------------------------------------------+


Check APT
---------

Checking available source versions...

Download source files with APT
------------------------------

Reading package lists...
NOTICE: 'consul' packaging is maintained in the 'Git' version control system at:
https://salsa.debian.org/go-team/packages/consul.git
Please use:
git clone https://salsa.debian.org/go-team/packages/consul.git
to retrieve the latest (possibly unreleased) updates to the package.
Need to get 5409 kB of source archives.
Get:1 http://172.17.0.1/private bullseye-staging/main consul 1.5.2+dfsg1-6 (dsc) [5436 B]
Get:2 http://172.17.0.1/private bullseye-staging/main consul 1.5.2+dfsg1-6 (tar) [5383 kB]
Get:3 http://172.17.0.1/private bullseye-staging/main consul 1.5.2+dfsg1-6 (diff) [21.0 kB]
Fetched 5409 kB in 2s (2773 kB/s)
Download complete and in download only mode
I: NOTICE: Log filtering will replace 'build/consul-SHy3m5/consul-1.5.2+dfsg1' with '<<PKGBUILDDIR>>'
I: NOTICE: Log filtering will replace 'build/consul-SHy3m5' with '<<BUILDDIR>>'

+------------------------------------------------------------------------------+
| Install build-essential                                                      |
+------------------------------------------------------------------------------+


Setup apt archive
-----------------

Merged Build-Depends: build-essential, fakeroot
Filtered Build-Depends: build-essential, fakeroot
dpkg-deb: building package 'sbuild-build-depends-core-dummy' in '/<<BUILDDIR>>/resolver-aykkKG/apt_archive/sbuild-build-depends-core-dummy.deb'.
dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning:   sbuild-build-depends-core-dummy
dpkg-scanpackages: info: Wrote 1 entries to output Packages file.
gpg: keybox '/<<BUILDDIR>>/resolver-aykkKG/gpg/pubring.kbx' created
gpg: /<<BUILDDIR>>/resolver-aykkKG/gpg/trustdb.gpg: trustdb created
gpg: key 35506D9A48F77B2E: public key "Sbuild Signer (Sbuild Build Dependency Archive Key) <buildd-tools-devel@lists.alioth.debian.org>" imported
gpg: Total number processed: 1
gpg:               imported: 1
gpg: key 35506D9A48F77B2E: "Sbuild Signer (Sbuild Build Dependency Archive Key) <buildd-tools-devel@lists.alioth.debian.org>" not changed
gpg: key 35506D9A48F77B2E: secret key imported
gpg: Total number processed: 1
gpg:              unchanged: 1
gpg:       secret keys read: 1
gpg:   secret keys imported: 1
gpg: using "Sbuild Signer" as default secret key for signing
Ign:1 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ InRelease
Get:2 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Release [957 B]
Get:3 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Release.gpg [370 B]
Get:4 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Sources [349 B]
Get:5 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Packages [433 B]
Fetched 2109 B in 1s (2755 B/s)
Reading package lists...
W: No sandbox user '_apt' on the system, can not drop privileges
Reading package lists...

Install core build dependencies (apt-based resolver)
----------------------------------------------------

Installing build dependencies
Reading package lists...
Building dependency tree...
Reading state information...
The following packages were automatically installed and are no longer required:
  libpam-cap netbase
Use 'apt autoremove' to remove them.
The following NEW packages will be installed:
  sbuild-build-depends-core-dummy
0 upgraded, 1 newly installed, 0 to remove and 29 not upgraded.
Need to get 852 B of archives.
After this operation, 0 B of additional disk space will be used.
Get:1 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ sbuild-build-depends-core-dummy 0.invalid.0 [852 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 852 B in 0s (15.8 kB/s)
Selecting previously unselected package sbuild-build-depends-core-dummy.
(Reading database ... 12199 files and directories currently installed.)
Preparing to unpack .../sbuild-build-depends-core-dummy_0.invalid.0_armhf.deb ...
Unpacking sbuild-build-depends-core-dummy (0.invalid.0) ...
Setting up sbuild-build-depends-core-dummy (0.invalid.0) ...
W: No sandbox user '_apt' on the system, can not drop privileges

+------------------------------------------------------------------------------+
| Check architectures                                                          |
+------------------------------------------------------------------------------+

Arch check ok (armhf included in any all)

+------------------------------------------------------------------------------+
| Install package build dependencies                                           |
+------------------------------------------------------------------------------+


Setup apt archive
-----------------

Merged Build-Depends: debhelper (>= 11~), bash-completion, dh-golang (>= 1.42~), golang-any (>= 2:1.13~), golang-github-asaskevich-govalidator-dev, golang-github-armon-circbuf-dev, golang-github-armon-go-metrics-dev (>= 0.0~git20171117~), golang-github-armon-go-radix-dev, golang-github-azure-go-autorest-dev (>= 10.15.5~), golang-github-bgentry-speakeasy-dev, golang-github-circonus-labs-circonus-gometrics-dev (>= 2.3.1~), golang-github-circonus-labs-circonusllhist-dev, golang-github-datadog-datadog-go-dev, golang-github-davecgh-go-spew-dev, golang-github-denverdino-aliyungo-dev, golang-github-digitalocean-godo-dev, golang-github-docker-go-connections-dev, golang-github-elazarl-go-bindata-assetfs-dev (>= 0.0~git20151224~), golang-github-ghodss-yaml-dev, golang-github-gogo-googleapis-dev, golang-github-gogo-protobuf-dev (>= 1.2.1~), golang-github-golang-snappy-dev, golang-github-googleapis-gnostic-dev, golang-github-google-gofuzz-dev, golang-github-gophercloud-gophercloud-dev, golang-github-gregjones-httpcache-dev, golang-github-hashicorp-go-checkpoint-dev, golang-github-hashicorp-go-cleanhttp-dev (>= 0.5.1~), golang-github-hashicorp-go-discover-dev, golang-github-hashicorp-go-hclog-dev (>= 0.9.2~), golang-github-hashicorp-go-immutable-radix-dev (>= 1.1.0~), golang-github-hashicorp-golang-lru-dev (>= 0.0~git20160207~), golang-github-hashicorp-go-memdb-dev (>= 0.0~git20180224~), golang-github-hashicorp-go-msgpack-dev (>= 0.5.5~), golang-github-hashicorp-go-multierror-dev, golang-github-hashicorp-go-raftchunking-dev, golang-github-hashicorp-go-reap-dev, golang-github-hashicorp-go-retryablehttp-dev, golang-github-hashicorp-go-rootcerts-dev, golang-github-hashicorp-go-sockaddr-dev, golang-github-hashicorp-go-syslog-dev, golang-github-hashicorp-go-uuid-dev, golang-github-hashicorp-go-version-dev, golang-github-hashicorp-hcl-dev, golang-github-hashicorp-hil-dev (>= 0.0~git20160711~), golang-github-hashicorp-logutils-dev, golang-github-hashicorp-memberlist-dev (>= 0.1.5~), 
golang-github-hashicorp-net-rpc-msgpackrpc-dev, golang-github-hashicorp-raft-boltdb-dev, golang-github-hashicorp-raft-dev (>= 1.1.1~), golang-github-hashicorp-scada-client-dev, golang-github-hashicorp-serf-dev (>= 0.8.4~), golang-github-hashicorp-yamux-dev (>= 0.0~git20151129~), golang-github-inconshreveable-muxado-dev, golang-github-imdario-mergo-dev, golang-github-jefferai-jsonx-dev, golang-github-json-iterator-go-dev, golang-github-kr-text-dev, golang-github-mattn-go-isatty-dev, golang-github-miekg-dns-dev, golang-github-mitchellh-cli-dev (>= 1.0.0~), golang-github-mitchellh-go-testing-interface-dev, golang-github-mitchellh-copystructure-dev, golang-github-mitchellh-hashstructure-dev, golang-github-mitchellh-mapstructure-dev, golang-github-mitchellh-reflectwalk-dev, golang-github-nytimes-gziphandler-dev, golang-github-packethost-packngo-dev, golang-github-pascaldekloe-goe-dev, golang-github-peterbourgon-diskv-dev, golang-github-pmezard-go-difflib-dev, golang-github-ryanuber-columnize-dev, golang-github-ryanuber-go-glob-dev, golang-github-shirou-gopsutil-dev, golang-github-spf13-pflag-dev, golang-golang-x-sys-dev (>= 0.0~git20161012~), golang-gopkg-inf.v0-dev, golang-gopkg-square-go-jose.v2-dev, mockery, golang-github-sap-go-hdb-dev
Filtered Build-Depends: debhelper (>= 11~), bash-completion, dh-golang (>= 1.42~), golang-any (>= 2:1.13~), golang-github-asaskevich-govalidator-dev, golang-github-armon-circbuf-dev, golang-github-armon-go-metrics-dev (>= 0.0~git20171117~), golang-github-armon-go-radix-dev, golang-github-azure-go-autorest-dev (>= 10.15.5~), golang-github-bgentry-speakeasy-dev, golang-github-circonus-labs-circonus-gometrics-dev (>= 2.3.1~), golang-github-circonus-labs-circonusllhist-dev, golang-github-datadog-datadog-go-dev, golang-github-davecgh-go-spew-dev, golang-github-denverdino-aliyungo-dev, golang-github-digitalocean-godo-dev, golang-github-docker-go-connections-dev, golang-github-elazarl-go-bindata-assetfs-dev (>= 0.0~git20151224~), golang-github-ghodss-yaml-dev, golang-github-gogo-googleapis-dev, golang-github-gogo-protobuf-dev (>= 1.2.1~), golang-github-golang-snappy-dev, golang-github-googleapis-gnostic-dev, golang-github-google-gofuzz-dev, golang-github-gophercloud-gophercloud-dev, golang-github-gregjones-httpcache-dev, golang-github-hashicorp-go-checkpoint-dev, golang-github-hashicorp-go-cleanhttp-dev (>= 0.5.1~), golang-github-hashicorp-go-discover-dev, golang-github-hashicorp-go-hclog-dev (>= 0.9.2~), golang-github-hashicorp-go-immutable-radix-dev (>= 1.1.0~), golang-github-hashicorp-golang-lru-dev (>= 0.0~git20160207~), golang-github-hashicorp-go-memdb-dev (>= 0.0~git20180224~), golang-github-hashicorp-go-msgpack-dev (>= 0.5.5~), golang-github-hashicorp-go-multierror-dev, golang-github-hashicorp-go-raftchunking-dev, golang-github-hashicorp-go-reap-dev, golang-github-hashicorp-go-retryablehttp-dev, golang-github-hashicorp-go-rootcerts-dev, golang-github-hashicorp-go-sockaddr-dev, golang-github-hashicorp-go-syslog-dev, golang-github-hashicorp-go-uuid-dev, golang-github-hashicorp-go-version-dev, golang-github-hashicorp-hcl-dev, golang-github-hashicorp-hil-dev (>= 0.0~git20160711~), golang-github-hashicorp-logutils-dev, golang-github-hashicorp-memberlist-dev (>= 0.1.5~), 
golang-github-hashicorp-net-rpc-msgpackrpc-dev, golang-github-hashicorp-raft-boltdb-dev, golang-github-hashicorp-raft-dev (>= 1.1.1~), golang-github-hashicorp-scada-client-dev, golang-github-hashicorp-serf-dev (>= 0.8.4~), golang-github-hashicorp-yamux-dev (>= 0.0~git20151129~), golang-github-inconshreveable-muxado-dev, golang-github-imdario-mergo-dev, golang-github-jefferai-jsonx-dev, golang-github-json-iterator-go-dev, golang-github-kr-text-dev, golang-github-mattn-go-isatty-dev, golang-github-miekg-dns-dev, golang-github-mitchellh-cli-dev (>= 1.0.0~), golang-github-mitchellh-go-testing-interface-dev, golang-github-mitchellh-copystructure-dev, golang-github-mitchellh-hashstructure-dev, golang-github-mitchellh-mapstructure-dev, golang-github-mitchellh-reflectwalk-dev, golang-github-nytimes-gziphandler-dev, golang-github-packethost-packngo-dev, golang-github-pascaldekloe-goe-dev, golang-github-peterbourgon-diskv-dev, golang-github-pmezard-go-difflib-dev, golang-github-ryanuber-columnize-dev, golang-github-ryanuber-go-glob-dev, golang-github-shirou-gopsutil-dev, golang-github-spf13-pflag-dev, golang-golang-x-sys-dev (>= 0.0~git20161012~), golang-gopkg-inf.v0-dev, golang-gopkg-square-go-jose.v2-dev, mockery, golang-github-sap-go-hdb-dev
dpkg-deb: building package 'sbuild-build-depends-consul-dummy' in '/<<BUILDDIR>>/resolver-aykkKG/apt_archive/sbuild-build-depends-consul-dummy.deb'.
dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning:   sbuild-build-depends-consul-dummy sbuild-build-depends-core-dummy
dpkg-scanpackages: info: Wrote 2 entries to output Packages file.
gpg: using "Sbuild Signer" as default secret key for signing
Ign:1 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ InRelease
Get:2 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Release [969 B]
Get:3 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Release.gpg [370 B]
Get:4 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Sources [1369 B]
Get:5 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ Packages [1458 B]
Fetched 4166 B in 1s (5295 B/s)
Reading package lists...
W: No sandbox user '_apt' on the system, can not drop privileges
Reading package lists...

Install consul build dependencies (apt-based resolver)
------------------------------------------------------

Installing build dependencies
Reading package lists...
Building dependency tree...
Reading state information...
The following packages were automatically installed and are no longer required:
  libpam-cap netbase
Use 'apt autoremove' to remove them.
The following additional packages will be installed:
  autoconf automake autopoint autotools-dev bash-completion bsdmainutils
  ca-certificates debhelper dh-autoreconf dh-golang dh-strip-nondeterminism
  dwz file gettext gettext-base gogoprotobuf golang-1.13-go golang-1.13-src
  golang-any golang-dbus-dev golang-ginkgo-dev
  golang-github-alecthomas-units-dev golang-github-armon-circbuf-dev
  golang-github-armon-go-metrics-dev golang-github-armon-go-radix-dev
  golang-github-asaskevich-govalidator-dev golang-github-aws-aws-sdk-go-dev
  golang-github-azure-go-autorest-dev golang-github-beorn7-perks-dev
  golang-github-bgentry-speakeasy-dev golang-github-boltdb-bolt-dev
  golang-github-bradfitz-gomemcache-dev golang-github-cespare-xxhash-dev
  golang-github-circonus-labs-circonus-gometrics-dev
  golang-github-circonus-labs-circonusllhist-dev
  golang-github-coreos-go-systemd-dev golang-github-coreos-pkg-dev
  golang-github-cyphar-filepath-securejoin-dev
  golang-github-datadog-datadog-go-dev golang-github-davecgh-go-spew-dev
  golang-github-denverdino-aliyungo-dev golang-github-dgrijalva-jwt-go-dev
  golang-github-dgrijalva-jwt-go-v3-dev golang-github-digitalocean-godo-dev
  golang-github-dimchansky-utfbom-dev golang-github-docker-go-connections-dev
  golang-github-docker-go-units-dev golang-github-docopt-docopt-go-dev
  golang-github-elazarl-go-bindata-assetfs-dev golang-github-fatih-color-dev
  golang-github-garyburd-redigo-dev golang-github-ghodss-yaml-dev
  golang-github-go-ini-ini-dev golang-github-go-kit-kit-dev
  golang-github-go-logfmt-logfmt-dev golang-github-go-stack-stack-dev
  golang-github-go-test-deep-dev golang-github-gogo-googleapis-dev
  golang-github-gogo-protobuf-dev golang-github-golang-mock-dev
  golang-github-golang-snappy-dev golang-github-google-btree-dev
  golang-github-google-go-cmp-dev golang-github-google-go-querystring-dev
  golang-github-google-gofuzz-dev golang-github-googleapis-gnostic-dev
  golang-github-gophercloud-gophercloud-dev
  golang-github-gregjones-httpcache-dev golang-github-hashicorp-errwrap-dev
  golang-github-hashicorp-go-checkpoint-dev
  golang-github-hashicorp-go-cleanhttp-dev
  golang-github-hashicorp-go-discover-dev golang-github-hashicorp-go-hclog-dev
  golang-github-hashicorp-go-immutable-radix-dev
  golang-github-hashicorp-go-memdb-dev golang-github-hashicorp-go-msgpack-dev
  golang-github-hashicorp-go-multierror-dev
  golang-github-hashicorp-go-raftchunking-dev
  golang-github-hashicorp-go-reap-dev
  golang-github-hashicorp-go-retryablehttp-dev
  golang-github-hashicorp-go-rootcerts-dev
  golang-github-hashicorp-go-sockaddr-dev
  golang-github-hashicorp-go-syslog-dev golang-github-hashicorp-go-uuid-dev
  golang-github-hashicorp-go-version-dev
  golang-github-hashicorp-golang-lru-dev golang-github-hashicorp-hcl-dev
  golang-github-hashicorp-hil-dev golang-github-hashicorp-logutils-dev
  golang-github-hashicorp-mdns-dev golang-github-hashicorp-memberlist-dev
  golang-github-hashicorp-net-rpc-msgpackrpc-dev
  golang-github-hashicorp-raft-boltdb-dev golang-github-hashicorp-raft-dev
  golang-github-hashicorp-scada-client-dev golang-github-hashicorp-serf-dev
  golang-github-hashicorp-yamux-dev golang-github-imdario-mergo-dev
  golang-github-inconshreveable-muxado-dev golang-github-jeffail-gabs-dev
  golang-github-jefferai-jsonx-dev golang-github-jmespath-go-jmespath-dev
  golang-github-jpillora-backoff-dev golang-github-json-iterator-go-dev
  golang-github-julienschmidt-httprouter-dev golang-github-kr-pretty-dev
  golang-github-kr-pty-dev golang-github-kr-text-dev
  golang-github-mattn-go-colorable-dev golang-github-mattn-go-isatty-dev
  golang-github-miekg-dns-dev golang-github-mitchellh-cli-dev
  golang-github-mitchellh-copystructure-dev
  golang-github-mitchellh-go-homedir-dev
  golang-github-mitchellh-go-testing-interface-dev
  golang-github-mitchellh-hashstructure-dev
  golang-github-mitchellh-mapstructure-dev
  golang-github-mitchellh-reflectwalk-dev
  golang-github-modern-go-concurrent-dev golang-github-modern-go-reflect2-dev
  golang-github-mwitkow-go-conntrack-dev golang-github-nytimes-gziphandler-dev
  golang-github-opencontainers-runc-dev
  golang-github-opencontainers-selinux-dev
  golang-github-opencontainers-specs-dev
  golang-github-opentracing-opentracing-go-dev
  golang-github-packethost-packngo-dev golang-github-pascaldekloe-goe-dev
  golang-github-peterbourgon-diskv-dev golang-github-pkg-errors-dev
  golang-github-pmezard-go-difflib-dev golang-github-posener-complete-dev
  golang-github-prometheus-client-golang-dev
  golang-github-prometheus-client-model-dev
  golang-github-prometheus-common-dev golang-github-ryanuber-columnize-dev
  golang-github-ryanuber-go-glob-dev golang-github-sap-go-hdb-dev
  golang-github-seccomp-libseccomp-golang-dev
  golang-github-shirou-gopsutil-dev golang-github-sirupsen-logrus-dev
  golang-github-spf13-pflag-dev golang-github-stretchr-objx-dev
  golang-github-stretchr-testify-dev golang-github-syndtr-goleveldb-dev
  golang-github-tent-http-link-go-dev golang-github-tv42-httpunix-dev
  golang-github-ugorji-go-codec-dev golang-github-ugorji-go-msgpack-dev
  golang-github-urfave-cli-dev golang-github-vishvananda-netlink-dev
  golang-github-vishvananda-netns-dev golang-github-vmihailenco-tagparser-dev
  golang-github-vmware-govmomi-dev golang-github-xeipuuv-gojsonpointer-dev
  golang-github-xeipuuv-gojsonreference-dev
  golang-github-xeipuuv-gojsonschema-dev golang-glog-dev golang-go
  golang-go.opencensus-dev golang-gocapability-dev golang-gogoprotobuf-dev
  golang-golang-x-crypto-dev golang-golang-x-net-dev
  golang-golang-x-oauth2-dev golang-golang-x-oauth2-google-dev
  golang-golang-x-sync-dev golang-golang-x-sys-dev golang-golang-x-text-dev
  golang-golang-x-time-dev golang-golang-x-tools golang-golang-x-tools-dev
  golang-golang-x-xerrors-dev golang-gomega-dev golang-google-api-dev
  golang-google-cloud-compute-metadata-dev golang-google-genproto-dev
  golang-google-grpc-dev golang-gopkg-alecthomas-kingpin.v2-dev
  golang-gopkg-check.v1-dev golang-gopkg-inf.v0-dev golang-gopkg-mgo.v2-dev
  golang-gopkg-square-go-jose.v2-dev golang-gopkg-tomb.v2-dev
  golang-gopkg-vmihailenco-msgpack.v2-dev golang-gopkg-yaml.v2-dev
  golang-goprotobuf-dev golang-procfs-dev golang-protobuf-extensions-dev
  golang-src groff-base intltool-debian iproute2 libarchive-zip-perl libbsd0
  libcroco3 libdebhelper-perl libelf1 libfile-stripnondeterminism-perl
  libglib2.0-0 libicu63 libjs-jquery libjs-jquery-ui libmagic-mgc libmagic1
  libmnl0 libncurses6 libpipeline1 libprocps7 libprotobuf-dev
  libprotobuf-lite17 libprotobuf17 libprotoc17 libsasl2-dev libseccomp-dev
  libsigsegv2 libssl1.1 libsub-override-perl libsystemd-dev libsystemd0
  libtinfo5 libtool libuchardet0 libxml2 libxtables12 lsof m4 man-db mockery
  openssl pkg-config po-debconf procps protobuf-compiler sensible-utils
  zlib1g-dev
Suggested packages:
  autoconf-archive gnu-standards autoconf-doc wamerican | wordlist whois
  vacation dh-make gettext-doc libasprintf-dev libgettextpo-dev bzr | brz git
  mercurial subversion mockgen golang-google-appengine-dev groff iproute2-doc
  libjs-jquery-ui-docs seccomp libtool-doc gfortran | fortran95-compiler
  gcj-jdk m4-doc apparmor less www-browser libmail-box-perl
Recommended packages:
  curl | wget | lynx golang-doc libatm1 libarchive-cpio-perl libglib2.0-data
  shared-mime-info xdg-user-dirs javascript-common libgpm2 libltdl-dev
  libmail-sendmail-perl psmisc
The following NEW packages will be installed:
  autoconf automake autopoint autotools-dev bash-completion bsdmainutils
  ca-certificates debhelper dh-autoreconf dh-golang dh-strip-nondeterminism
  dwz file gettext gettext-base gogoprotobuf golang-1.13-go golang-1.13-src
  golang-any golang-dbus-dev golang-ginkgo-dev
  golang-github-alecthomas-units-dev golang-github-armon-circbuf-dev
  golang-github-armon-go-metrics-dev golang-github-armon-go-radix-dev
  golang-github-asaskevich-govalidator-dev golang-github-aws-aws-sdk-go-dev
  golang-github-azure-go-autorest-dev golang-github-beorn7-perks-dev
  golang-github-bgentry-speakeasy-dev golang-github-boltdb-bolt-dev
  golang-github-bradfitz-gomemcache-dev golang-github-cespare-xxhash-dev
  golang-github-circonus-labs-circonus-gometrics-dev
  golang-github-circonus-labs-circonusllhist-dev
  golang-github-coreos-go-systemd-dev golang-github-coreos-pkg-dev
  golang-github-cyphar-filepath-securejoin-dev
  golang-github-datadog-datadog-go-dev golang-github-davecgh-go-spew-dev
  golang-github-denverdino-aliyungo-dev golang-github-dgrijalva-jwt-go-dev
  golang-github-dgrijalva-jwt-go-v3-dev golang-github-digitalocean-godo-dev
  golang-github-dimchansky-utfbom-dev golang-github-docker-go-connections-dev
  golang-github-docker-go-units-dev golang-github-docopt-docopt-go-dev
  golang-github-elazarl-go-bindata-assetfs-dev golang-github-fatih-color-dev
  golang-github-garyburd-redigo-dev golang-github-ghodss-yaml-dev
  golang-github-go-ini-ini-dev golang-github-go-kit-kit-dev
  golang-github-go-logfmt-logfmt-dev golang-github-go-stack-stack-dev
  golang-github-go-test-deep-dev golang-github-gogo-googleapis-dev
  golang-github-gogo-protobuf-dev golang-github-golang-mock-dev
  golang-github-golang-snappy-dev golang-github-google-btree-dev
  golang-github-google-go-cmp-dev golang-github-google-go-querystring-dev
  golang-github-google-gofuzz-dev golang-github-googleapis-gnostic-dev
  golang-github-gophercloud-gophercloud-dev
  golang-github-gregjones-httpcache-dev golang-github-hashicorp-errwrap-dev
  golang-github-hashicorp-go-checkpoint-dev
  golang-github-hashicorp-go-cleanhttp-dev
  golang-github-hashicorp-go-discover-dev golang-github-hashicorp-go-hclog-dev
  golang-github-hashicorp-go-immutable-radix-dev
  golang-github-hashicorp-go-memdb-dev golang-github-hashicorp-go-msgpack-dev
  golang-github-hashicorp-go-multierror-dev
  golang-github-hashicorp-go-raftchunking-dev
  golang-github-hashicorp-go-reap-dev
  golang-github-hashicorp-go-retryablehttp-dev
  golang-github-hashicorp-go-rootcerts-dev
  golang-github-hashicorp-go-sockaddr-dev
  golang-github-hashicorp-go-syslog-dev golang-github-hashicorp-go-uuid-dev
  golang-github-hashicorp-go-version-dev
  golang-github-hashicorp-golang-lru-dev golang-github-hashicorp-hcl-dev
  golang-github-hashicorp-hil-dev golang-github-hashicorp-logutils-dev
  golang-github-hashicorp-mdns-dev golang-github-hashicorp-memberlist-dev
  golang-github-hashicorp-net-rpc-msgpackrpc-dev
  golang-github-hashicorp-raft-boltdb-dev golang-github-hashicorp-raft-dev
  golang-github-hashicorp-scada-client-dev golang-github-hashicorp-serf-dev
  golang-github-hashicorp-yamux-dev golang-github-imdario-mergo-dev
  golang-github-inconshreveable-muxado-dev golang-github-jeffail-gabs-dev
  golang-github-jefferai-jsonx-dev golang-github-jmespath-go-jmespath-dev
  golang-github-jpillora-backoff-dev golang-github-json-iterator-go-dev
  golang-github-julienschmidt-httprouter-dev golang-github-kr-pretty-dev
  golang-github-kr-pty-dev golang-github-kr-text-dev
  golang-github-mattn-go-colorable-dev golang-github-mattn-go-isatty-dev
  golang-github-miekg-dns-dev golang-github-mitchellh-cli-dev
  golang-github-mitchellh-copystructure-dev
  golang-github-mitchellh-go-homedir-dev
  golang-github-mitchellh-go-testing-interface-dev
  golang-github-mitchellh-hashstructure-dev
  golang-github-mitchellh-mapstructure-dev
  golang-github-mitchellh-reflectwalk-dev
  golang-github-modern-go-concurrent-dev golang-github-modern-go-reflect2-dev
  golang-github-mwitkow-go-conntrack-dev golang-github-nytimes-gziphandler-dev
  golang-github-opencontainers-runc-dev
  golang-github-opencontainers-selinux-dev
  golang-github-opencontainers-specs-dev
  golang-github-opentracing-opentracing-go-dev
  golang-github-packethost-packngo-dev golang-github-pascaldekloe-goe-dev
  golang-github-peterbourgon-diskv-dev golang-github-pkg-errors-dev
  golang-github-pmezard-go-difflib-dev golang-github-posener-complete-dev
  golang-github-prometheus-client-golang-dev
  golang-github-prometheus-client-model-dev
  golang-github-prometheus-common-dev golang-github-ryanuber-columnize-dev
  golang-github-ryanuber-go-glob-dev golang-github-sap-go-hdb-dev
  golang-github-seccomp-libseccomp-golang-dev
  golang-github-shirou-gopsutil-dev golang-github-sirupsen-logrus-dev
  golang-github-spf13-pflag-dev golang-github-stretchr-objx-dev
  golang-github-stretchr-testify-dev golang-github-syndtr-goleveldb-dev
  golang-github-tent-http-link-go-dev golang-github-tv42-httpunix-dev
  golang-github-ugorji-go-codec-dev golang-github-ugorji-go-msgpack-dev
  golang-github-urfave-cli-dev golang-github-vishvananda-netlink-dev
  golang-github-vishvananda-netns-dev golang-github-vmihailenco-tagparser-dev
  golang-github-vmware-govmomi-dev golang-github-xeipuuv-gojsonpointer-dev
  golang-github-xeipuuv-gojsonreference-dev
  golang-github-xeipuuv-gojsonschema-dev golang-glog-dev golang-go
  golang-go.opencensus-dev golang-gocapability-dev golang-gogoprotobuf-dev
  golang-golang-x-crypto-dev golang-golang-x-net-dev
  golang-golang-x-oauth2-dev golang-golang-x-oauth2-google-dev
  golang-golang-x-sync-dev golang-golang-x-sys-dev golang-golang-x-text-dev
  golang-golang-x-time-dev golang-golang-x-tools golang-golang-x-tools-dev
  golang-golang-x-xerrors-dev golang-gomega-dev golang-google-api-dev
  golang-google-cloud-compute-metadata-dev golang-google-genproto-dev
  golang-google-grpc-dev golang-gopkg-alecthomas-kingpin.v2-dev
  golang-gopkg-check.v1-dev golang-gopkg-inf.v0-dev golang-gopkg-mgo.v2-dev
  golang-gopkg-square-go-jose.v2-dev golang-gopkg-tomb.v2-dev
  golang-gopkg-vmihailenco-msgpack.v2-dev golang-gopkg-yaml.v2-dev
  golang-goprotobuf-dev golang-procfs-dev golang-protobuf-extensions-dev
  golang-src groff-base intltool-debian iproute2 libarchive-zip-perl libbsd0
  libcroco3 libdebhelper-perl libelf1 libfile-stripnondeterminism-perl
  libglib2.0-0 libicu63 libjs-jquery libjs-jquery-ui libmagic-mgc libmagic1
  libmnl0 libncurses6 libpipeline1 libprocps7 libprotobuf-dev
  libprotobuf-lite17 libprotobuf17 libprotoc17 libsasl2-dev libseccomp-dev
  libsigsegv2 libssl1.1 libsub-override-perl libsystemd-dev libtinfo5 libtool
  libuchardet0 libxml2 libxtables12 lsof m4 man-db mockery openssl pkg-config
  po-debconf procps protobuf-compiler sbuild-build-depends-consul-dummy
  sensible-utils zlib1g-dev
The following packages will be upgraded:
  libsystemd0
1 upgraded, 236 newly installed, 0 to remove and 28 not upgraded.
Need to get 158 MB of archives.
After this operation, 985 MB of additional disk space will be used.
Get:1 copy:/<<BUILDDIR>>/resolver-aykkKG/apt_archive ./ sbuild-build-depends-consul-dummy 0.invalid.0 [1708 B]
Get:2 http://172.17.0.1/private bullseye-staging/main armhf libsystemd0 armhf 244-3+rpi1+b1 [311 kB]
Get:3 http://172.17.0.1/private bullseye-staging/main armhf libbsd0 armhf 0.10.0-1 [112 kB]
Get:4 http://172.17.0.1/private bullseye-staging/main armhf libtinfo5 armhf 6.1+20191019-1 [316 kB]
Get:5 http://172.17.0.1/private bullseye-staging/main armhf bsdmainutils armhf 11.1.2 [182 kB]
Get:6 http://172.17.0.1/private bullseye-staging/main armhf libuchardet0 armhf 0.0.6-3 [62.2 kB]
Get:7 http://172.17.0.1/private bullseye-staging/main armhf groff-base armhf 1.22.4-4 [783 kB]
Get:8 http://172.17.0.1/private bullseye-staging/main armhf libpipeline1 armhf 1.5.1-3 [28.3 kB]
Get:9 http://172.17.0.1/private bullseye-staging/main armhf man-db armhf 2.9.0-2 [1261 kB]
Get:10 http://172.17.0.1/private bullseye-staging/main armhf golang-github-davecgh-go-spew-dev all 1.1.1-2 [29.7 kB]
Get:11 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pmezard-go-difflib-dev all 1.0.0-2 [12.0 kB]
Get:12 http://172.17.0.1/private bullseye-staging/main armhf golang-github-stretchr-objx-dev all 0.1.1+git20180825.ef50b0d-1 [23.4 kB]
Get:13 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-pty-dev all 1.1.6-1 [10.6 kB]
Get:14 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-text-dev all 0.1.0-1 [10.8 kB]
Get:15 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-pretty-dev all 0.1.0-1 [10.2 kB]
Get:16 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-check.v1-dev all 0.0+git20180628.788fd78-1 [31.6 kB]
Get:17 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-yaml.v2-dev all 2.2.2-1 [58.9 kB]
Get:18 http://172.17.0.1/private bullseye-staging/main armhf golang-github-stretchr-testify-dev all 1.4.0+ds-1 [53.5 kB]
Get:19 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-sys-dev all 0.0~git20190726.fc99dfb-1 [395 kB]
Get:20 http://172.17.0.1/private bullseye-staging/main armhf golang-github-sirupsen-logrus-dev all 1.4.2-1 [41.2 kB]
Get:21 http://172.17.0.1/private bullseye-staging/main armhf libelf1 armhf 0.176-1.1 [158 kB]
Get:22 http://172.17.0.1/private bullseye-staging/main armhf libmnl0 armhf 1.0.4-2 [11.3 kB]
Get:23 http://172.17.0.1/private bullseye-staging/main armhf libxtables12 armhf 1.8.3-2 [77.3 kB]
Get:24 http://172.17.0.1/private bullseye-staging/main armhf iproute2 armhf 5.4.0-1 [762 kB]
Get:25 http://172.17.0.1/private bullseye-staging/main armhf libncurses6 armhf 6.1+20191019-1 [79.5 kB]
Get:26 http://172.17.0.1/private bullseye-staging/main armhf libprocps7 armhf 2:3.3.15-2 [58.9 kB]
Get:27 http://172.17.0.1/private bullseye-staging/main armhf procps armhf 2:3.3.15-2 [235 kB]
Get:28 http://172.17.0.1/private bullseye-staging/main armhf sensible-utils all 0.0.12+nmu1 [16.0 kB]
Get:29 http://172.17.0.1/private bullseye-staging/main armhf bash-completion all 1:2.8-6 [208 kB]
Get:30 http://172.17.0.1/private bullseye-staging/main armhf libmagic-mgc armhf 1:5.37-6 [253 kB]
Get:31 http://172.17.0.1/private bullseye-staging/main armhf libmagic1 armhf 1:5.37-6 [111 kB]
Get:32 http://172.17.0.1/private bullseye-staging/main armhf file armhf 1:5.37-6 [66.2 kB]
Get:33 http://172.17.0.1/private bullseye-staging/main armhf gettext-base armhf 0.19.8.1-10 [117 kB]
Get:34 http://172.17.0.1/private bullseye-staging/main armhf lsof armhf 4.93.2+dfsg-1 [307 kB]
Get:35 http://172.17.0.1/private bullseye-staging/main armhf libsigsegv2 armhf 2.12-2 [32.3 kB]
Get:36 http://172.17.0.1/private bullseye-staging/main armhf m4 armhf 1.4.18-4 [185 kB]
Get:37 http://172.17.0.1/private bullseye-staging/main armhf autoconf all 2.69-11 [341 kB]
Get:38 http://172.17.0.1/private bullseye-staging/main armhf autotools-dev all 20180224.1 [77.0 kB]
Get:39 http://172.17.0.1/private bullseye-staging/main armhf automake all 1:1.16.1-4 [771 kB]
Get:40 http://172.17.0.1/private bullseye-staging/main armhf autopoint all 0.19.8.1-10 [435 kB]
Get:41 http://172.17.0.1/private bullseye-staging/main armhf libssl1.1 armhf 1.1.1d-2 [1268 kB]
Get:42 http://172.17.0.1/private bullseye-staging/main armhf openssl armhf 1.1.1d-2 [806 kB]
Get:43 http://172.17.0.1/private bullseye-staging/main armhf ca-certificates all 20190110 [157 kB]
Get:44 http://172.17.0.1/private bullseye-staging/main armhf libtool all 2.4.6-11 [547 kB]
Get:45 http://172.17.0.1/private bullseye-staging/main armhf dh-autoreconf all 19 [16.9 kB]
Get:46 http://172.17.0.1/private bullseye-staging/main armhf libdebhelper-perl all 12.7.2 [174 kB]
Get:47 http://172.17.0.1/private bullseye-staging/main armhf libarchive-zip-perl all 1.67-1 [104 kB]
Get:48 http://172.17.0.1/private bullseye-staging/main armhf libsub-override-perl all 0.09-2 [10.2 kB]
Get:49 http://172.17.0.1/private bullseye-staging/main armhf libfile-stripnondeterminism-perl all 1.6.3-1 [23.6 kB]
Get:50 http://172.17.0.1/private bullseye-staging/main armhf dh-strip-nondeterminism all 1.6.3-1 [14.6 kB]
Get:51 http://172.17.0.1/private bullseye-staging/main armhf dwz armhf 0.13-5 [142 kB]
Get:52 http://172.17.0.1/private bullseye-staging/main armhf libglib2.0-0 armhf 2.62.3-2 [1137 kB]
Get:53 http://172.17.0.1/private bullseye-staging/main armhf libicu63 armhf 63.2-2 [7974 kB]
Get:54 http://172.17.0.1/private bullseye-staging/main armhf libxml2 armhf 2.9.4+dfsg1-8 [593 kB]
Get:55 http://172.17.0.1/private bullseye-staging/main armhf libcroco3 armhf 0.6.13-1 [133 kB]
Get:56 http://172.17.0.1/private bullseye-staging/main armhf gettext armhf 0.19.8.1-10 [1219 kB]
Get:57 http://172.17.0.1/private bullseye-staging/main armhf intltool-debian all 0.35.0+20060710.5 [26.8 kB]
Get:58 http://172.17.0.1/private bullseye-staging/main armhf po-debconf all 1.0.21 [248 kB]
Get:59 http://172.17.0.1/private bullseye-staging/main armhf debhelper all 12.7.2 [1018 kB]
Get:60 http://172.17.0.1/private bullseye-staging/main armhf dh-golang all 1.43 [22.4 kB]
Get:61 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gogo-protobuf-dev all 1.2.1+git20190611.dadb6258-1 [863 kB]
Get:62 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf17 armhf 3.6.1.3-2+rpi1 [665 kB]
Get:63 http://172.17.0.1/private bullseye-staging/main armhf libprotoc17 armhf 3.6.1.3-2+rpi1 [546 kB]
Get:64 http://172.17.0.1/private bullseye-staging/main armhf protobuf-compiler armhf 3.6.1.3-2+rpi1 [64.5 kB]
Get:65 http://172.17.0.1/private bullseye-staging/main armhf gogoprotobuf armhf 1.2.1+git20190611.dadb6258-1 [5285 kB]
Get:66 http://172.17.0.1/private bullseye-staging/main armhf golang-1.13-src armhf 1.13.5-1+rpi1 [12.7 MB]
Get:67 http://172.17.0.1/private bullseye-staging/main armhf golang-1.13-go armhf 1.13.5-1+rpi1 [43.5 MB]
Get:68 http://172.17.0.1/private bullseye-staging/main armhf golang-src armhf 2:1.13~1+b14 [4896 B]
Get:69 http://172.17.0.1/private bullseye-staging/main armhf golang-go armhf 2:1.13~1+b14 [23.9 kB]
Get:70 http://172.17.0.1/private bullseye-staging/main armhf golang-any armhf 2:1.13~1+b14 [5012 B]
Get:71 http://172.17.0.1/private bullseye-staging/main armhf golang-dbus-dev all 5.0.3-1 [55.6 kB]
Get:72 http://172.17.0.1/private bullseye-staging/main armhf golang-github-alecthomas-units-dev all 0.0~git20151022.0.2efee85-4 [5816 B]
Get:73 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-circbuf-dev all 0.0~git20150827.0.bbbad09-2 [3952 B]
Get:74 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pkg-errors-dev all 0.8.1-1 [11.2 kB]
Get:75 http://172.17.0.1/private bullseye-staging/main armhf golang-github-circonus-labs-circonusllhist-dev all 0.0~git20160526.0.d724266-2 [6974 B]
Get:76 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-cleanhttp-dev all 0.5.1-1 [10.4 kB]
Get:77 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mattn-go-isatty-dev all 0.0.8-2 [5864 B]
Get:78 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mattn-go-colorable-dev all 0.0.9-3 [7960 B]
Get:79 http://172.17.0.1/private bullseye-staging/main armhf golang-github-fatih-color-dev all 1.7.0-1 [11.4 kB]
Get:80 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-hclog-dev all 0.10.1-1 [17.9 kB]
Get:81 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-retryablehttp-dev all 0.6.4-1 [17.3 kB]
Get:82 http://172.17.0.1/private bullseye-staging/main armhf golang-github-tv42-httpunix-dev all 0.0~git20150427.b75d861-2 [3744 B]
Get:83 http://172.17.0.1/private bullseye-staging/main armhf golang-github-circonus-labs-circonus-gometrics-dev all 2.3.1-2 [64.4 kB]
Get:84 http://172.17.0.1/private bullseye-staging/main armhf golang-github-datadog-datadog-go-dev all 2.1.0-2 [14.7 kB]
Get:85 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-uuid-dev all 1.0.1-1 [8476 B]
Get:86 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-golang-lru-dev all 0.5.3-1 [14.4 kB]
Get:87 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-immutable-radix-dev all 1.1.0-1 [22.8 kB]
Get:88 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pascaldekloe-goe-dev all 0.1.0-2 [21.7 kB]
Get:89 http://172.17.0.1/private bullseye-staging/main armhf golang-github-beorn7-perks-dev all 0.0~git20160804.0.4c0e845-1 [11.6 kB]
Get:90 http://172.17.0.1/private bullseye-staging/main armhf golang-github-cespare-xxhash-dev all 2.1.1-1 [8748 B]
Get:91 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-gofuzz-dev all 0.0~git20170612.24818f7-1 [9108 B]
Get:92 http://172.17.0.1/private bullseye-staging/main armhf golang-github-modern-go-concurrent-dev all 1.0.3-1 [4520 B]
Get:93 http://172.17.0.1/private bullseye-staging/main armhf golang-github-modern-go-reflect2-dev all 1.0.0-1 [14.4 kB]
Get:94 http://172.17.0.1/private bullseye-staging/main armhf golang-github-json-iterator-go-dev all 1.1.4-1 [62.6 kB]
Get:95 http://172.17.0.1/private bullseye-staging/main armhf zlib1g-dev armhf 1:1.2.11.dfsg-1 [206 kB]
Get:96 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf-lite17 armhf 3.6.1.3-2+rpi1 [147 kB]
Get:97 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf-dev armhf 3.6.1.3-2+rpi1 [1001 kB]
Get:98 http://172.17.0.1/private bullseye-staging/main armhf golang-goprotobuf-dev armhf 1.3.2-2 [1369 kB]
Get:99 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-client-model-dev all 0.0.2+git20171117.99fa1f4-1 [19.3 kB]
Get:100 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dgrijalva-jwt-go-v3-dev all 3.2.0-2 [32.4 kB]
Get:101 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-logfmt-logfmt-dev all 0.3.0-1 [12.5 kB]
Get:102 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-stack-stack-dev all 1.5.2-2 [6956 B]
Get:103 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-sync-dev all 0.0~git20190423.1122301-1 [17.1 kB]
Get:104 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-xerrors-dev all 0.0~git20190717.a985d34-1 [12.8 kB]
Get:105 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-tools-dev all 1:0.0~git20191118.07fc4c7+ds-1 [1396 kB]
Get:106 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-text-dev all 0.3.2-1 [3689 kB]
Get:107 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-net-dev all 1:0.0+git20191112.2180aed+dfsg-1 [637 kB]
Get:108 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opentracing-opentracing-go-dev all 1.0.2-1 [21.8 kB]
Get:109 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-time-dev all 0.0~git20161028.0.f51c127-2 [9396 B]
Get:110 http://172.17.0.1/private bullseye-staging/main armhf golang-github-golang-mock-dev all 1.3.1-2 [35.1 kB]
Get:111 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-go-cmp-dev all 0.3.1-1 [65.2 kB]
Get:112 http://172.17.0.1/private bullseye-staging/main armhf golang-glog-dev all 0.0~git20160126.23def4e-3 [17.3 kB]
Get:113 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-oauth2-dev all 0.0~git20190604.0f29369-2 [31.9 kB]
Get:114 http://172.17.0.1/private bullseye-staging/main armhf golang-google-cloud-compute-metadata-dev all 0.43.0-1 [31.1 kB]
Get:115 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-oauth2-google-dev all 0.0~git20190604.0f29369-2 [13.2 kB]
Get:116 http://172.17.0.1/private bullseye-staging/main armhf golang-google-genproto-dev all 0.0~git20190801.fa694d8-2 [2897 kB]
Get:117 http://172.17.0.1/private bullseye-staging/main armhf golang-google-grpc-dev all 1.22.1-1 [493 kB]
Get:118 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-kit-kit-dev all 0.6.0-2 [103 kB]
Get:119 http://172.17.0.1/private bullseye-staging/main armhf golang-github-julienschmidt-httprouter-dev all 1.1-5 [16.0 kB]
Get:120 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jpillora-backoff-dev all 1.0.0-1 [3580 B]
Get:121 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mwitkow-go-conntrack-dev all 0.0~git20190716.2f06839-1 [14.4 kB]
Get:122 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-alecthomas-kingpin.v2-dev all 2.2.6-1 [42.2 kB]
Get:123 http://172.17.0.1/private bullseye-staging/main armhf golang-protobuf-extensions-dev all 1.0.1-1 [29.6 kB]
Get:124 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-common-dev all 0.7.0-1 [83.8 kB]
Get:125 http://172.17.0.1/private bullseye-staging/main armhf golang-procfs-dev all 0.0.3-1 [78.0 kB]
Get:126 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-client-golang-dev all 1.2.1-3 [106 kB]
Get:127 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-go-metrics-dev all 0.3.0-1 [26.7 kB]
Get:128 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-go-radix-dev all 1.0.0-1 [7420 B]
Get:129 http://172.17.0.1/private bullseye-staging/main armhf golang-github-asaskevich-govalidator-dev all 9+git20180720.0.f9ffefc3-1 [41.2 kB]
Get:130 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-ini-ini-dev all 1.32.0-2 [32.7 kB]
Get:131 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jmespath-go-jmespath-dev all 0.2.2-3 [18.7 kB]
Get:132 http://172.17.0.1/private bullseye-staging/main armhf golang-github-aws-aws-sdk-go-dev all 1.21.6+dfsg-2 [4969 kB]
Get:133 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dgrijalva-jwt-go-dev all 3.2.0-1 [32.5 kB]
Get:134 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dimchansky-utfbom-dev all 0.0~git20170328.6c6132f-1 [4712 B]
Get:135 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-go-homedir-dev all 1.1.0-1 [5168 B]
Get:136 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-crypto-dev all 1:0.0~git20190701.4def268-2 [1505 kB]
Get:137 http://172.17.0.1/private bullseye-staging/main armhf golang-github-azure-go-autorest-dev all 10.15.5-1 [99.2 kB]
Get:138 http://172.17.0.1/private bullseye-staging/main armhf golang-github-bgentry-speakeasy-dev all 0.1.0-1 [5110 B]
Get:139 http://172.17.0.1/private bullseye-staging/main armhf golang-github-boltdb-bolt-dev all 1.3.1-6 [60.6 kB]
Get:140 http://172.17.0.1/private bullseye-staging/main armhf golang-github-bradfitz-gomemcache-dev all 0.0~git20141109-3 [10.3 kB]
Get:141 http://172.17.0.1/private bullseye-staging/main armhf golang-github-coreos-pkg-dev all 4-2 [25.1 kB]
Get:142 http://172.17.0.1/private bullseye-staging/main armhf libsystemd-dev armhf 244-3+rpi1+b1 [333 kB]
Get:143 http://172.17.0.1/private bullseye-staging/main armhf pkg-config armhf 0.29-6 [59.8 kB]
Get:144 http://172.17.0.1/private bullseye-staging/main armhf golang-github-coreos-go-systemd-dev all 22.0.0-1 [51.7 kB]
Get:145 http://172.17.0.1/private bullseye-staging/main armhf golang-github-cyphar-filepath-securejoin-dev all 0.2.2-1 [7196 B]
Get:146 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-go-querystring-dev all 1.0.0-1 [7456 B]
Get:147 http://172.17.0.1/private bullseye-staging/main armhf golang-github-tent-http-link-go-dev all 0.0~git20130702.0.ac974c6-6 [5016 B]
Get:148 http://172.17.0.1/private bullseye-staging/main armhf golang-github-digitalocean-godo-dev all 1.1.0-1 [42.6 kB]
Get:149 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docker-go-units-dev all 0.4.0-1 [7536 B]
Get:150 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-selinux-dev all 1.3.0-2 [13.3 kB]
Get:151 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonpointer-dev all 0.0~git20151027.0.e0fe6f6-2 [4620 B]
Get:152 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonreference-dev all 0.0~git20150808.0.e02fc20-2 [4592 B]
Get:153 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonschema-dev all 0.0~git20170210.0.6b67b3f-2 [25.3 kB]
Get:154 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-specs-dev all 1.0.1+git20190408.a1b50f6-1 [27.7 kB]
Get:155 http://172.17.0.1/private bullseye-staging/main armhf libseccomp-dev armhf 2.4.2-2+rpi1 [69.5 kB]
Get:156 http://172.17.0.1/private bullseye-staging/main armhf golang-github-seccomp-libseccomp-golang-dev all 0.9.1-1 [16.1 kB]
Get:157 http://172.17.0.1/private bullseye-staging/main armhf golang-github-urfave-cli-dev all 1.20.0-1 [51.0 kB]
Get:158 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vishvananda-netns-dev all 0.0~git20170707.0.86bef33-1 [5646 B]
Get:159 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vishvananda-netlink-dev all 1.0.0+git20181030.023a6da-1 [106 kB]
Get:160 http://172.17.0.1/private bullseye-staging/main armhf golang-gocapability-dev all 0.0+git20180916.d983527-1 [11.8 kB]
Get:161 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-runc-dev all 1.0.0~rc9+dfsg1-1+rpi1 [178 kB]
Get:162 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docker-go-connections-dev all 0.4.0-1 [26.3 kB]
Get:163 http://172.17.0.1/private bullseye-staging/main armhf golang-github-elazarl-go-bindata-assetfs-dev all 1.0.0-1 [5460 B]
Get:164 http://172.17.0.1/private bullseye-staging/main armhf golang-github-garyburd-redigo-dev all 0.0~git20150901.0.d8dbe4d-2 [28.0 kB]
Get:165 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ghodss-yaml-dev all 1.0.0-1 [12.9 kB]
Get:166 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-test-deep-dev all 1.0.3-1 [9876 B]
Get:167 http://172.17.0.1/private bullseye-staging/main armhf golang-gogoprotobuf-dev all 1.2.1+git20190611.dadb6258-1 [5340 B]
Get:168 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gogo-googleapis-dev all 1.2.0-1 [30.4 kB]
Get:169 http://172.17.0.1/private bullseye-staging/main armhf golang-github-golang-snappy-dev all 0.0+git20160529.d9eb7a3-3 [51.2 kB]
Get:170 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-btree-dev all 1.0.0-1 [13.2 kB]
Get:171 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docopt-docopt-go-dev all 0.6.2+git20160216.0.784ddc5-1 [9434 B]
Get:172 http://172.17.0.1/private bullseye-staging/main armhf golang-github-googleapis-gnostic-dev all 0.2.0-1 [74.4 kB]
Get:173 http://172.17.0.1/private bullseye-staging/main armhf golang-github-peterbourgon-diskv-dev all 3.0.0-1 [18.8 kB]
Get:174 http://172.17.0.1/private bullseye-staging/main armhf golang-gomega-dev all 1.0+git20160910.d59fa0a-1 [63.7 kB]
Get:175 http://172.17.0.1/private bullseye-staging/main armhf golang-ginkgo-dev armhf 1.2.0+git20161006.acfa16a-1 [1535 kB]
Get:176 http://172.17.0.1/private bullseye-staging/main armhf golang-github-syndtr-goleveldb-dev all 0.0~git20170725.0.b89cc31-2 [116 kB]
Get:177 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gregjones-httpcache-dev all 0.0~git20180305.9cad4c3-1 [13.6 kB]
Get:178 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-errwrap-dev all 1.0.0-1 [10.3 kB]
Get:179 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-checkpoint-dev all 0.0~git20171009.1545e56-2 [8184 B]
Get:180 http://172.17.0.1/private bullseye-staging/main armhf golang-github-denverdino-aliyungo-dev all 0.0~git20180921.13fa8aa-2 [125 kB]
Get:181 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gophercloud-gophercloud-dev all 0.6.0-1 [570 kB]
Get:182 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-multierror-dev all 1.0.0-1 [10.6 kB]
Get:183 http://172.17.0.1/private bullseye-staging/main armhf golang-github-miekg-dns-dev all 1.0.4+ds-1 [126 kB]
Get:184 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-mdns-dev all 1.0.1-1 [11.9 kB]
Get:185 http://172.17.0.1/private bullseye-staging/main armhf golang-github-packethost-packngo-dev all 0.2.0-2 [40.7 kB]
Get:186 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vmware-govmomi-dev all 0.15.0-1 [10.2 MB]
Get:187 http://172.17.0.1/private bullseye-staging/main armhf golang-go.opencensus-dev all 0.22.0-1 [120 kB]
Get:188 http://172.17.0.1/private bullseye-staging/main armhf golang-google-api-dev all 0.7.0-2 [2971 kB]
Get:189 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-discover-dev all 0.0+git20190905.34a6505-2 [26.7 kB]
Get:190 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-memdb-dev all 0.0~git20180224.1289e7ff-1 [27.1 kB]
Get:191 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ugorji-go-codec-dev all 1.1.7-1 [201 kB]
Get:192 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ugorji-go-msgpack-dev all 0.0~git20130605.792643-5 [20.7 kB]
Get:193 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vmihailenco-tagparser-dev all 0.1.1-2 [4440 B]
Get:194 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-vmihailenco-msgpack.v2-dev all 4.2.2-1 [27.2 kB]
Get:195 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-tomb.v2-dev all 0.0~git20161208.d5d1b58-3 [6840 B]
Get:196 http://172.17.0.1/private bullseye-staging/main armhf libsasl2-dev armhf 2.1.27+dfsg-1+b1 [255 kB]
Get:197 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-mgo.v2-dev all 2016.08.01-6 [316 kB]
Get:198 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-msgpack-dev all 0.5.5-1 [43.3 kB]
Get:199 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-raft-dev all 1.1.1-5 [88.7 kB]
Get:200 http://172.17.0.1/private bullseye-staging/main armhf libjs-jquery all 3.3.1~dfsg-3 [332 kB]
Get:201 http://172.17.0.1/private bullseye-staging/main armhf libjs-jquery-ui all 1.12.1+dfsg-5 [232 kB]
Get:202 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-tools armhf 1:0.0~git20191118.07fc4c7+ds-1 [28.9 MB]
Get:203 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-reflectwalk-dev all 0.0~git20170726.63d60e9-4 [7868 B]
Get:204 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-copystructure-dev all 0.0~git20161013.0.5af94ae-2 [8704 B]
Get:205 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-raftchunking-dev all 0.6.2-2 [12.3 kB]
Get:206 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-reap-dev all 0.0~git20160113.0.2d85522-3 [9334 B]
Get:207 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-sockaddr-dev all 0.0~git20170627.41949a1+ds-2 [62.7 kB]
Get:208 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-version-dev all 1.2.0-1 [13.8 kB]
Get:209 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-hcl-dev all 1.0.0-1 [58.5 kB]
Get:210 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-mapstructure-dev all 1.1.2-1 [21.1 kB]
Get:211 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-hil-dev all 0.0~git20160711.1e86c6b-1 [32.6 kB]
Get:212 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-memberlist-dev all 0.1.5-2 [74.8 kB]
Get:213 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-raft-boltdb-dev all 0.0~git20171010.6e5ba93-3 [11.1 kB]
Get:214 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-net-rpc-msgpackrpc-dev all 0.0~git20151116.0.a14192a-1 [4168 B]
Get:215 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-yamux-dev all 0.0+git20190923.df201c7-1 [22.0 kB]
Get:216 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-scada-client-dev all 0.0~git20160601.0.6e89678-2 [19.3 kB]
Get:217 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-syslog-dev all 0.0~git20150218.0.42a2b57-1 [5336 B]
Get:218 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-logutils-dev all 0.0~git20150609.0.0dc08b1-1 [8150 B]
Get:219 http://172.17.0.1/private bullseye-staging/main armhf golang-github-posener-complete-dev all 1.1+git20180108.57878c9-3 [17.6 kB]
Get:220 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-cli-dev all 1.0.0-1 [23.8 kB]
Get:221 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ryanuber-columnize-dev all 2.1.1-1 [6600 B]
Get:222 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-serf-dev all 0.8.5~ds1-1 [127 kB]
Get:223 http://172.17.0.1/private bullseye-staging/main armhf golang-github-imdario-mergo-dev all 0.3.5-1 [16.4 kB]
Get:224 http://172.17.0.1/private bullseye-staging/main armhf golang-github-inconshreveable-muxado-dev all 0.0~git20140312.0.f693c7e-2 [26.5 kB]
Get:225 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jeffail-gabs-dev all 2.3.0-1 [16.9 kB]
Get:226 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jefferai-jsonx-dev all 1.0.1-2 [4552 B]
Get:227 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-go-testing-interface-dev all 1.0.0-1 [4268 B]
Get:228 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-hashstructure-dev all 1.0.0-1 [7400 B]
Get:229 http://172.17.0.1/private bullseye-staging/main armhf golang-github-nytimes-gziphandler-dev all 1.1.1-1 [39.9 kB]
Get:230 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ryanuber-go-glob-dev all 1.0.0-2 [4588 B]
Get:231 http://172.17.0.1/private bullseye-staging/main armhf golang-github-sap-go-hdb-dev all 0.14.1-2 [61.9 kB]
Get:232 http://172.17.0.1/private bullseye-staging/main armhf golang-github-shirou-gopsutil-dev all 2.18.06-1 [89.3 kB]
Get:233 http://172.17.0.1/private bullseye-staging/main armhf golang-github-spf13-pflag-dev all 1.0.3-1 [38.0 kB]
Get:234 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-inf.v0-dev all 0.9.0-3 [14.0 kB]
Get:235 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-square-go-jose.v2-dev all 2.4.1-1 [264 kB]
Get:236 http://172.17.0.1/private bullseye-staging/main armhf mockery armhf 0.0~git20181123.e78b021-2 [1598 kB]
Get:237 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-rootcerts-dev all 0.0~git20160503.0.6bb64b3-1 [7336 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 158 MB in 56s (2827 kB/s)
(Reading database ... 12199 files and directories currently installed.)
Preparing to unpack .../libsystemd0_244-3+rpi1+b1_armhf.deb ...
Unpacking libsystemd0:armhf (244-3+rpi1+b1) over (243-8+rpi1) ...
Setting up libsystemd0:armhf (244-3+rpi1+b1) ...
Selecting previously unselected package libbsd0:armhf.
(Reading database ... 12200 files and directories currently installed.)
Preparing to unpack .../000-libbsd0_0.10.0-1_armhf.deb ...
Unpacking libbsd0:armhf (0.10.0-1) ...
Selecting previously unselected package libtinfo5:armhf.
Preparing to unpack .../001-libtinfo5_6.1+20191019-1_armhf.deb ...
Unpacking libtinfo5:armhf (6.1+20191019-1) ...
Selecting previously unselected package bsdmainutils.
Preparing to unpack .../002-bsdmainutils_11.1.2_armhf.deb ...
Unpacking bsdmainutils (11.1.2) ...
Selecting previously unselected package libuchardet0:armhf.
Preparing to unpack .../003-libuchardet0_0.0.6-3_armhf.deb ...
Unpacking libuchardet0:armhf (0.0.6-3) ...
Selecting previously unselected package groff-base.
Preparing to unpack .../004-groff-base_1.22.4-4_armhf.deb ...
Unpacking groff-base (1.22.4-4) ...
Selecting previously unselected package libpipeline1:armhf.
Preparing to unpack .../005-libpipeline1_1.5.1-3_armhf.deb ...
Unpacking libpipeline1:armhf (1.5.1-3) ...
Selecting previously unselected package man-db.
Preparing to unpack .../006-man-db_2.9.0-2_armhf.deb ...
Unpacking man-db (2.9.0-2) ...
Selecting previously unselected package golang-github-davecgh-go-spew-dev.
Preparing to unpack .../007-golang-github-davecgh-go-spew-dev_1.1.1-2_all.deb ...
Unpacking golang-github-davecgh-go-spew-dev (1.1.1-2) ...
Selecting previously unselected package golang-github-pmezard-go-difflib-dev.
Preparing to unpack .../008-golang-github-pmezard-go-difflib-dev_1.0.0-2_all.deb ...
Unpacking golang-github-pmezard-go-difflib-dev (1.0.0-2) ...
Selecting previously unselected package golang-github-stretchr-objx-dev.
Preparing to unpack .../009-golang-github-stretchr-objx-dev_0.1.1+git20180825.ef50b0d-1_all.deb ...
Unpacking golang-github-stretchr-objx-dev (0.1.1+git20180825.ef50b0d-1) ...
Selecting previously unselected package golang-github-kr-pty-dev.
Preparing to unpack .../010-golang-github-kr-pty-dev_1.1.6-1_all.deb ...
Unpacking golang-github-kr-pty-dev (1.1.6-1) ...
Selecting previously unselected package golang-github-kr-text-dev.
Preparing to unpack .../011-golang-github-kr-text-dev_0.1.0-1_all.deb ...
Unpacking golang-github-kr-text-dev (0.1.0-1) ...
Selecting previously unselected package golang-github-kr-pretty-dev.
Preparing to unpack .../012-golang-github-kr-pretty-dev_0.1.0-1_all.deb ...
Unpacking golang-github-kr-pretty-dev (0.1.0-1) ...
Selecting previously unselected package golang-gopkg-check.v1-dev.
Preparing to unpack .../013-golang-gopkg-check.v1-dev_0.0+git20180628.788fd78-1_all.deb ...
Unpacking golang-gopkg-check.v1-dev (0.0+git20180628.788fd78-1) ...
Selecting previously unselected package golang-gopkg-yaml.v2-dev.
Preparing to unpack .../014-golang-gopkg-yaml.v2-dev_2.2.2-1_all.deb ...
Unpacking golang-gopkg-yaml.v2-dev (2.2.2-1) ...
Selecting previously unselected package golang-github-stretchr-testify-dev.
Preparing to unpack .../015-golang-github-stretchr-testify-dev_1.4.0+ds-1_all.deb ...
Unpacking golang-github-stretchr-testify-dev (1.4.0+ds-1) ...
Selecting previously unselected package golang-golang-x-sys-dev.
Preparing to unpack .../016-golang-golang-x-sys-dev_0.0~git20190726.fc99dfb-1_all.deb ...
Unpacking golang-golang-x-sys-dev (0.0~git20190726.fc99dfb-1) ...
Selecting previously unselected package golang-github-sirupsen-logrus-dev.
Preparing to unpack .../017-golang-github-sirupsen-logrus-dev_1.4.2-1_all.deb ...
Unpacking golang-github-sirupsen-logrus-dev (1.4.2-1) ...
Selecting previously unselected package libelf1:armhf.
Preparing to unpack .../018-libelf1_0.176-1.1_armhf.deb ...
Unpacking libelf1:armhf (0.176-1.1) ...
Selecting previously unselected package libmnl0:armhf.
Preparing to unpack .../019-libmnl0_1.0.4-2_armhf.deb ...
Unpacking libmnl0:armhf (1.0.4-2) ...
Selecting previously unselected package libxtables12:armhf.
Preparing to unpack .../020-libxtables12_1.8.3-2_armhf.deb ...
Unpacking libxtables12:armhf (1.8.3-2) ...
Selecting previously unselected package iproute2.
Preparing to unpack .../021-iproute2_5.4.0-1_armhf.deb ...
Unpacking iproute2 (5.4.0-1) ...
Selecting previously unselected package libncurses6:armhf.
Preparing to unpack .../022-libncurses6_6.1+20191019-1_armhf.deb ...
Unpacking libncurses6:armhf (6.1+20191019-1) ...
Selecting previously unselected package libprocps7:armhf.
Preparing to unpack .../023-libprocps7_2%3a3.3.15-2_armhf.deb ...
Unpacking libprocps7:armhf (2:3.3.15-2) ...
Selecting previously unselected package procps.
Preparing to unpack .../024-procps_2%3a3.3.15-2_armhf.deb ...
Unpacking procps (2:3.3.15-2) ...
Selecting previously unselected package sensible-utils.
Preparing to unpack .../025-sensible-utils_0.0.12+nmu1_all.deb ...
Unpacking sensible-utils (0.0.12+nmu1) ...
Selecting previously unselected package bash-completion.
Preparing to unpack .../026-bash-completion_1%3a2.8-6_all.deb ...
Unpacking bash-completion (1:2.8-6) ...
Selecting previously unselected package libmagic-mgc.
Preparing to unpack .../027-libmagic-mgc_1%3a5.37-6_armhf.deb ...
Unpacking libmagic-mgc (1:5.37-6) ...
Selecting previously unselected package libmagic1:armhf.
Preparing to unpack .../028-libmagic1_1%3a5.37-6_armhf.deb ...
Unpacking libmagic1:armhf (1:5.37-6) ...
Selecting previously unselected package file.
Preparing to unpack .../029-file_1%3a5.37-6_armhf.deb ...
Unpacking file (1:5.37-6) ...
Selecting previously unselected package gettext-base.
Preparing to unpack .../030-gettext-base_0.19.8.1-10_armhf.deb ...
Unpacking gettext-base (0.19.8.1-10) ...
Selecting previously unselected package lsof.
Preparing to unpack .../031-lsof_4.93.2+dfsg-1_armhf.deb ...
Unpacking lsof (4.93.2+dfsg-1) ...
Selecting previously unselected package libsigsegv2:armhf.
Preparing to unpack .../032-libsigsegv2_2.12-2_armhf.deb ...
Unpacking libsigsegv2:armhf (2.12-2) ...
Selecting previously unselected package m4.
Preparing to unpack .../033-m4_1.4.18-4_armhf.deb ...
Unpacking m4 (1.4.18-4) ...
Selecting previously unselected package autoconf.
Preparing to unpack .../034-autoconf_2.69-11_all.deb ...
Unpacking autoconf (2.69-11) ...
Selecting previously unselected package autotools-dev.
Preparing to unpack .../035-autotools-dev_20180224.1_all.deb ...
Unpacking autotools-dev (20180224.1) ...
Selecting previously unselected package automake.
Preparing to unpack .../036-automake_1%3a1.16.1-4_all.deb ...
Unpacking automake (1:1.16.1-4) ...
Selecting previously unselected package autopoint.
Preparing to unpack .../037-autopoint_0.19.8.1-10_all.deb ...
Unpacking autopoint (0.19.8.1-10) ...
Selecting previously unselected package libssl1.1:armhf.
Preparing to unpack .../038-libssl1.1_1.1.1d-2_armhf.deb ...
Unpacking libssl1.1:armhf (1.1.1d-2) ...
Selecting previously unselected package openssl.
Preparing to unpack .../039-openssl_1.1.1d-2_armhf.deb ...
Unpacking openssl (1.1.1d-2) ...
Selecting previously unselected package ca-certificates.
Preparing to unpack .../040-ca-certificates_20190110_all.deb ...
Unpacking ca-certificates (20190110) ...
Selecting previously unselected package libtool.
Preparing to unpack .../041-libtool_2.4.6-11_all.deb ...
Unpacking libtool (2.4.6-11) ...
Selecting previously unselected package dh-autoreconf.
Preparing to unpack .../042-dh-autoreconf_19_all.deb ...
Unpacking dh-autoreconf (19) ...
Selecting previously unselected package libdebhelper-perl.
Preparing to unpack .../043-libdebhelper-perl_12.7.2_all.deb ...
Unpacking libdebhelper-perl (12.7.2) ...
Selecting previously unselected package libarchive-zip-perl.
Preparing to unpack .../044-libarchive-zip-perl_1.67-1_all.deb ...
Unpacking libarchive-zip-perl (1.67-1) ...
Selecting previously unselected package libsub-override-perl.
Preparing to unpack .../045-libsub-override-perl_0.09-2_all.deb ...
Unpacking libsub-override-perl (0.09-2) ...
Selecting previously unselected package libfile-stripnondeterminism-perl.
Preparing to unpack .../046-libfile-stripnondeterminism-perl_1.6.3-1_all.deb ...
Unpacking libfile-stripnondeterminism-perl (1.6.3-1) ...
Selecting previously unselected package dh-strip-nondeterminism.
Preparing to unpack .../047-dh-strip-nondeterminism_1.6.3-1_all.deb ...
Unpacking dh-strip-nondeterminism (1.6.3-1) ...
Selecting previously unselected package dwz.
Preparing to unpack .../048-dwz_0.13-5_armhf.deb ...
Unpacking dwz (0.13-5) ...
Selecting previously unselected package libglib2.0-0:armhf.
Preparing to unpack .../049-libglib2.0-0_2.62.3-2_armhf.deb ...
Unpacking libglib2.0-0:armhf (2.62.3-2) ...
Selecting previously unselected package libicu63:armhf.
Preparing to unpack .../050-libicu63_63.2-2_armhf.deb ...
Unpacking libicu63:armhf (63.2-2) ...
Selecting previously unselected package libxml2:armhf.
Preparing to unpack .../051-libxml2_2.9.4+dfsg1-8_armhf.deb ...
Unpacking libxml2:armhf (2.9.4+dfsg1-8) ...
Selecting previously unselected package libcroco3:armhf.
Preparing to unpack .../052-libcroco3_0.6.13-1_armhf.deb ...
Unpacking libcroco3:armhf (0.6.13-1) ...
Selecting previously unselected package gettext.
Preparing to unpack .../053-gettext_0.19.8.1-10_armhf.deb ...
Unpacking gettext (0.19.8.1-10) ...
Selecting previously unselected package intltool-debian.
Preparing to unpack .../054-intltool-debian_0.35.0+20060710.5_all.deb ...
Unpacking intltool-debian (0.35.0+20060710.5) ...
Selecting previously unselected package po-debconf.
Preparing to unpack .../055-po-debconf_1.0.21_all.deb ...
Unpacking po-debconf (1.0.21) ...
Selecting previously unselected package debhelper.
Preparing to unpack .../056-debhelper_12.7.2_all.deb ...
Unpacking debhelper (12.7.2) ...
Selecting previously unselected package dh-golang.
Preparing to unpack .../057-dh-golang_1.43_all.deb ...
Unpacking dh-golang (1.43) ...
Selecting previously unselected package golang-github-gogo-protobuf-dev.
Preparing to unpack .../058-golang-github-gogo-protobuf-dev_1.2.1+git20190611.dadb6258-1_all.deb ...
Unpacking golang-github-gogo-protobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package libprotobuf17:armhf.
Preparing to unpack .../059-libprotobuf17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package libprotoc17:armhf.
Preparing to unpack .../060-libprotoc17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotoc17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package protobuf-compiler.
Preparing to unpack .../061-protobuf-compiler_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking protobuf-compiler (3.6.1.3-2+rpi1) ...
Selecting previously unselected package gogoprotobuf.
Preparing to unpack .../062-gogoprotobuf_1.2.1+git20190611.dadb6258-1_armhf.deb ...
Unpacking gogoprotobuf (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package golang-1.13-src.
Preparing to unpack .../063-golang-1.13-src_1.13.5-1+rpi1_armhf.deb ...
Unpacking golang-1.13-src (1.13.5-1+rpi1) ...
Selecting previously unselected package golang-1.13-go.
Preparing to unpack .../064-golang-1.13-go_1.13.5-1+rpi1_armhf.deb ...
Unpacking golang-1.13-go (1.13.5-1+rpi1) ...
Selecting previously unselected package golang-src.
Preparing to unpack .../065-golang-src_2%3a1.13~1+b14_armhf.deb ...
Unpacking golang-src (2:1.13~1+b14) ...
Selecting previously unselected package golang-go.
Preparing to unpack .../066-golang-go_2%3a1.13~1+b14_armhf.deb ...
Unpacking golang-go (2:1.13~1+b14) ...
Selecting previously unselected package golang-any.
Preparing to unpack .../067-golang-any_2%3a1.13~1+b14_armhf.deb ...
Unpacking golang-any (2:1.13~1+b14) ...
Selecting previously unselected package golang-dbus-dev.
Preparing to unpack .../068-golang-dbus-dev_5.0.3-1_all.deb ...
Unpacking golang-dbus-dev (5.0.3-1) ...
Selecting previously unselected package golang-github-alecthomas-units-dev.
Preparing to unpack .../069-golang-github-alecthomas-units-dev_0.0~git20151022.0.2efee85-4_all.deb ...
Unpacking golang-github-alecthomas-units-dev (0.0~git20151022.0.2efee85-4) ...
Selecting previously unselected package golang-github-armon-circbuf-dev.
Preparing to unpack .../070-golang-github-armon-circbuf-dev_0.0~git20150827.0.bbbad09-2_all.deb ...
Unpacking golang-github-armon-circbuf-dev (0.0~git20150827.0.bbbad09-2) ...
Selecting previously unselected package golang-github-pkg-errors-dev.
Preparing to unpack .../071-golang-github-pkg-errors-dev_0.8.1-1_all.deb ...
Unpacking golang-github-pkg-errors-dev (0.8.1-1) ...
Selecting previously unselected package golang-github-circonus-labs-circonusllhist-dev.
Preparing to unpack .../072-golang-github-circonus-labs-circonusllhist-dev_0.0~git20160526.0.d724266-2_all.deb ...
Unpacking golang-github-circonus-labs-circonusllhist-dev (0.0~git20160526.0.d724266-2) ...
Selecting previously unselected package golang-github-hashicorp-go-cleanhttp-dev.
Preparing to unpack .../073-golang-github-hashicorp-go-cleanhttp-dev_0.5.1-1_all.deb ...
Unpacking golang-github-hashicorp-go-cleanhttp-dev (0.5.1-1) ...
Selecting previously unselected package golang-github-mattn-go-isatty-dev.
Preparing to unpack .../074-golang-github-mattn-go-isatty-dev_0.0.8-2_all.deb ...
Unpacking golang-github-mattn-go-isatty-dev (0.0.8-2) ...
Selecting previously unselected package golang-github-mattn-go-colorable-dev.
Preparing to unpack .../075-golang-github-mattn-go-colorable-dev_0.0.9-3_all.deb ...
Unpacking golang-github-mattn-go-colorable-dev (0.0.9-3) ...
Selecting previously unselected package golang-github-fatih-color-dev.
Preparing to unpack .../076-golang-github-fatih-color-dev_1.7.0-1_all.deb ...
Unpacking golang-github-fatih-color-dev (1.7.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-hclog-dev.
Preparing to unpack .../077-golang-github-hashicorp-go-hclog-dev_0.10.1-1_all.deb ...
Unpacking golang-github-hashicorp-go-hclog-dev (0.10.1-1) ...
Selecting previously unselected package golang-github-hashicorp-go-retryablehttp-dev.
Preparing to unpack .../078-golang-github-hashicorp-go-retryablehttp-dev_0.6.4-1_all.deb ...
Unpacking golang-github-hashicorp-go-retryablehttp-dev (0.6.4-1) ...
Selecting previously unselected package golang-github-tv42-httpunix-dev.
Preparing to unpack .../079-golang-github-tv42-httpunix-dev_0.0~git20150427.b75d861-2_all.deb ...
Unpacking golang-github-tv42-httpunix-dev (0.0~git20150427.b75d861-2) ...
Selecting previously unselected package golang-github-circonus-labs-circonus-gometrics-dev.
Preparing to unpack .../080-golang-github-circonus-labs-circonus-gometrics-dev_2.3.1-2_all.deb ...
Unpacking golang-github-circonus-labs-circonus-gometrics-dev (2.3.1-2) ...
Selecting previously unselected package golang-github-datadog-datadog-go-dev.
Preparing to unpack .../081-golang-github-datadog-datadog-go-dev_2.1.0-2_all.deb ...
Unpacking golang-github-datadog-datadog-go-dev (2.1.0-2) ...
Selecting previously unselected package golang-github-hashicorp-go-uuid-dev.
Preparing to unpack .../082-golang-github-hashicorp-go-uuid-dev_1.0.1-1_all.deb ...
Unpacking golang-github-hashicorp-go-uuid-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-hashicorp-golang-lru-dev.
Preparing to unpack .../083-golang-github-hashicorp-golang-lru-dev_0.5.3-1_all.deb ...
Unpacking golang-github-hashicorp-golang-lru-dev (0.5.3-1) ...
Selecting previously unselected package golang-github-hashicorp-go-immutable-radix-dev.
Preparing to unpack .../084-golang-github-hashicorp-go-immutable-radix-dev_1.1.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-immutable-radix-dev (1.1.0-1) ...
Selecting previously unselected package golang-github-pascaldekloe-goe-dev.
Preparing to unpack .../085-golang-github-pascaldekloe-goe-dev_0.1.0-2_all.deb ...
Unpacking golang-github-pascaldekloe-goe-dev (0.1.0-2) ...
Selecting previously unselected package golang-github-beorn7-perks-dev.
Preparing to unpack .../086-golang-github-beorn7-perks-dev_0.0~git20160804.0.4c0e845-1_all.deb ...
Unpacking golang-github-beorn7-perks-dev (0.0~git20160804.0.4c0e845-1) ...
Selecting previously unselected package golang-github-cespare-xxhash-dev.
Preparing to unpack .../087-golang-github-cespare-xxhash-dev_2.1.1-1_all.deb ...
Unpacking golang-github-cespare-xxhash-dev (2.1.1-1) ...
Selecting previously unselected package golang-github-google-gofuzz-dev.
Preparing to unpack .../088-golang-github-google-gofuzz-dev_0.0~git20170612.24818f7-1_all.deb ...
Unpacking golang-github-google-gofuzz-dev (0.0~git20170612.24818f7-1) ...
Selecting previously unselected package golang-github-modern-go-concurrent-dev.
Preparing to unpack .../089-golang-github-modern-go-concurrent-dev_1.0.3-1_all.deb ...
Unpacking golang-github-modern-go-concurrent-dev (1.0.3-1) ...
Selecting previously unselected package golang-github-modern-go-reflect2-dev.
Preparing to unpack .../090-golang-github-modern-go-reflect2-dev_1.0.0-1_all.deb ...
Unpacking golang-github-modern-go-reflect2-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-json-iterator-go-dev.
Preparing to unpack .../091-golang-github-json-iterator-go-dev_1.1.4-1_all.deb ...
Unpacking golang-github-json-iterator-go-dev (1.1.4-1) ...
Selecting previously unselected package zlib1g-dev:armhf.
Preparing to unpack .../092-zlib1g-dev_1%3a1.2.11.dfsg-1_armhf.deb ...
Unpacking zlib1g-dev:armhf (1:1.2.11.dfsg-1) ...
Selecting previously unselected package libprotobuf-lite17:armhf.
Preparing to unpack .../093-libprotobuf-lite17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf-lite17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package libprotobuf-dev:armhf.
Preparing to unpack .../094-libprotobuf-dev_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf-dev:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package golang-goprotobuf-dev.
Preparing to unpack .../095-golang-goprotobuf-dev_1.3.2-2_armhf.deb ...
Unpacking golang-goprotobuf-dev (1.3.2-2) ...
Selecting previously unselected package golang-github-prometheus-client-model-dev.
Preparing to unpack .../096-golang-github-prometheus-client-model-dev_0.0.2+git20171117.99fa1f4-1_all.deb ...
Unpacking golang-github-prometheus-client-model-dev (0.0.2+git20171117.99fa1f4-1) ...
Selecting previously unselected package golang-github-dgrijalva-jwt-go-v3-dev.
Preparing to unpack .../097-golang-github-dgrijalva-jwt-go-v3-dev_3.2.0-2_all.deb ...
Unpacking golang-github-dgrijalva-jwt-go-v3-dev (3.2.0-2) ...
Selecting previously unselected package golang-github-go-logfmt-logfmt-dev.
Preparing to unpack .../098-golang-github-go-logfmt-logfmt-dev_0.3.0-1_all.deb ...
Unpacking golang-github-go-logfmt-logfmt-dev (0.3.0-1) ...
Selecting previously unselected package golang-github-go-stack-stack-dev.
Preparing to unpack .../099-golang-github-go-stack-stack-dev_1.5.2-2_all.deb ...
Unpacking golang-github-go-stack-stack-dev (1.5.2-2) ...
Selecting previously unselected package golang-golang-x-sync-dev.
Preparing to unpack .../100-golang-golang-x-sync-dev_0.0~git20190423.1122301-1_all.deb ...
Unpacking golang-golang-x-sync-dev (0.0~git20190423.1122301-1) ...
Selecting previously unselected package golang-golang-x-xerrors-dev.
Preparing to unpack .../101-golang-golang-x-xerrors-dev_0.0~git20190717.a985d34-1_all.deb ...
Unpacking golang-golang-x-xerrors-dev (0.0~git20190717.a985d34-1) ...
Selecting previously unselected package golang-golang-x-tools-dev.
Preparing to unpack .../102-golang-golang-x-tools-dev_1%3a0.0~git20191118.07fc4c7+ds-1_all.deb ...
Unpacking golang-golang-x-tools-dev (1:0.0~git20191118.07fc4c7+ds-1) ...
Selecting previously unselected package golang-golang-x-text-dev.
Preparing to unpack .../103-golang-golang-x-text-dev_0.3.2-1_all.deb ...
Unpacking golang-golang-x-text-dev (0.3.2-1) ...
Selecting previously unselected package golang-golang-x-net-dev.
Preparing to unpack .../104-golang-golang-x-net-dev_1%3a0.0+git20191112.2180aed+dfsg-1_all.deb ...
Unpacking golang-golang-x-net-dev (1:0.0+git20191112.2180aed+dfsg-1) ...
Selecting previously unselected package golang-github-opentracing-opentracing-go-dev.
Preparing to unpack .../105-golang-github-opentracing-opentracing-go-dev_1.0.2-1_all.deb ...
Unpacking golang-github-opentracing-opentracing-go-dev (1.0.2-1) ...
Selecting previously unselected package golang-golang-x-time-dev.
Preparing to unpack .../106-golang-golang-x-time-dev_0.0~git20161028.0.f51c127-2_all.deb ...
Unpacking golang-golang-x-time-dev (0.0~git20161028.0.f51c127-2) ...
Selecting previously unselected package golang-github-golang-mock-dev.
Preparing to unpack .../107-golang-github-golang-mock-dev_1.3.1-2_all.deb ...
Unpacking golang-github-golang-mock-dev (1.3.1-2) ...
Selecting previously unselected package golang-github-google-go-cmp-dev.
Preparing to unpack .../108-golang-github-google-go-cmp-dev_0.3.1-1_all.deb ...
Unpacking golang-github-google-go-cmp-dev (0.3.1-1) ...
Selecting previously unselected package golang-glog-dev.
Preparing to unpack .../109-golang-glog-dev_0.0~git20160126.23def4e-3_all.deb ...
Unpacking golang-glog-dev (0.0~git20160126.23def4e-3) ...
Selecting previously unselected package golang-golang-x-oauth2-dev.
Preparing to unpack .../110-golang-golang-x-oauth2-dev_0.0~git20190604.0f29369-2_all.deb ...
Unpacking golang-golang-x-oauth2-dev (0.0~git20190604.0f29369-2) ...
Selecting previously unselected package golang-google-cloud-compute-metadata-dev.
Preparing to unpack .../111-golang-google-cloud-compute-metadata-dev_0.43.0-1_all.deb ...
Unpacking golang-google-cloud-compute-metadata-dev (0.43.0-1) ...
Selecting previously unselected package golang-golang-x-oauth2-google-dev.
Preparing to unpack .../112-golang-golang-x-oauth2-google-dev_0.0~git20190604.0f29369-2_all.deb ...
Unpacking golang-golang-x-oauth2-google-dev (0.0~git20190604.0f29369-2) ...
Selecting previously unselected package golang-google-genproto-dev.
Preparing to unpack .../113-golang-google-genproto-dev_0.0~git20190801.fa694d8-2_all.deb ...
Unpacking golang-google-genproto-dev (0.0~git20190801.fa694d8-2) ...
Selecting previously unselected package golang-google-grpc-dev.
Preparing to unpack .../114-golang-google-grpc-dev_1.22.1-1_all.deb ...
Unpacking golang-google-grpc-dev (1.22.1-1) ...
Selecting previously unselected package golang-github-go-kit-kit-dev.
Preparing to unpack .../115-golang-github-go-kit-kit-dev_0.6.0-2_all.deb ...
Unpacking golang-github-go-kit-kit-dev (0.6.0-2) ...
Selecting previously unselected package golang-github-julienschmidt-httprouter-dev.
Preparing to unpack .../116-golang-github-julienschmidt-httprouter-dev_1.1-5_all.deb ...
Unpacking golang-github-julienschmidt-httprouter-dev (1.1-5) ...
Selecting previously unselected package golang-github-jpillora-backoff-dev.
Preparing to unpack .../117-golang-github-jpillora-backoff-dev_1.0.0-1_all.deb ...
Unpacking golang-github-jpillora-backoff-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mwitkow-go-conntrack-dev.
Preparing to unpack .../118-golang-github-mwitkow-go-conntrack-dev_0.0~git20190716.2f06839-1_all.deb ...
Unpacking golang-github-mwitkow-go-conntrack-dev (0.0~git20190716.2f06839-1) ...
Selecting previously unselected package golang-gopkg-alecthomas-kingpin.v2-dev.
Preparing to unpack .../119-golang-gopkg-alecthomas-kingpin.v2-dev_2.2.6-1_all.deb ...
Unpacking golang-gopkg-alecthomas-kingpin.v2-dev (2.2.6-1) ...
Selecting previously unselected package golang-protobuf-extensions-dev.
Preparing to unpack .../120-golang-protobuf-extensions-dev_1.0.1-1_all.deb ...
Unpacking golang-protobuf-extensions-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-prometheus-common-dev.
Preparing to unpack .../121-golang-github-prometheus-common-dev_0.7.0-1_all.deb ...
Unpacking golang-github-prometheus-common-dev (0.7.0-1) ...
Selecting previously unselected package golang-procfs-dev.
Preparing to unpack .../122-golang-procfs-dev_0.0.3-1_all.deb ...
Unpacking golang-procfs-dev (0.0.3-1) ...
Selecting previously unselected package golang-github-prometheus-client-golang-dev.
Preparing to unpack .../123-golang-github-prometheus-client-golang-dev_1.2.1-3_all.deb ...
Unpacking golang-github-prometheus-client-golang-dev (1.2.1-3) ...
Selecting previously unselected package golang-github-armon-go-metrics-dev.
Preparing to unpack .../124-golang-github-armon-go-metrics-dev_0.3.0-1_all.deb ...
Unpacking golang-github-armon-go-metrics-dev (0.3.0-1) ...
Selecting previously unselected package golang-github-armon-go-radix-dev.
Preparing to unpack .../125-golang-github-armon-go-radix-dev_1.0.0-1_all.deb ...
Unpacking golang-github-armon-go-radix-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-asaskevich-govalidator-dev.
Preparing to unpack .../126-golang-github-asaskevich-govalidator-dev_9+git20180720.0.f9ffefc3-1_all.deb ...
Unpacking golang-github-asaskevich-govalidator-dev (9+git20180720.0.f9ffefc3-1) ...
Selecting previously unselected package golang-github-go-ini-ini-dev.
Preparing to unpack .../127-golang-github-go-ini-ini-dev_1.32.0-2_all.deb ...
Unpacking golang-github-go-ini-ini-dev (1.32.0-2) ...
Selecting previously unselected package golang-github-jmespath-go-jmespath-dev.
Preparing to unpack .../128-golang-github-jmespath-go-jmespath-dev_0.2.2-3_all.deb ...
Unpacking golang-github-jmespath-go-jmespath-dev (0.2.2-3) ...
Selecting previously unselected package golang-github-aws-aws-sdk-go-dev.
Preparing to unpack .../129-golang-github-aws-aws-sdk-go-dev_1.21.6+dfsg-2_all.deb ...
Unpacking golang-github-aws-aws-sdk-go-dev (1.21.6+dfsg-2) ...
Selecting previously unselected package golang-github-dgrijalva-jwt-go-dev.
Preparing to unpack .../130-golang-github-dgrijalva-jwt-go-dev_3.2.0-1_all.deb ...
Unpacking golang-github-dgrijalva-jwt-go-dev (3.2.0-1) ...
Selecting previously unselected package golang-github-dimchansky-utfbom-dev.
Preparing to unpack .../131-golang-github-dimchansky-utfbom-dev_0.0~git20170328.6c6132f-1_all.deb ...
Unpacking golang-github-dimchansky-utfbom-dev (0.0~git20170328.6c6132f-1) ...
Selecting previously unselected package golang-github-mitchellh-go-homedir-dev.
Preparing to unpack .../132-golang-github-mitchellh-go-homedir-dev_1.1.0-1_all.deb ...
Unpacking golang-github-mitchellh-go-homedir-dev (1.1.0-1) ...
Selecting previously unselected package golang-golang-x-crypto-dev.
Preparing to unpack .../133-golang-golang-x-crypto-dev_1%3a0.0~git20190701.4def268-2_all.deb ...
Unpacking golang-golang-x-crypto-dev (1:0.0~git20190701.4def268-2) ...
Selecting previously unselected package golang-github-azure-go-autorest-dev.
Preparing to unpack .../134-golang-github-azure-go-autorest-dev_10.15.5-1_all.deb ...
Unpacking golang-github-azure-go-autorest-dev (10.15.5-1) ...
Selecting previously unselected package golang-github-bgentry-speakeasy-dev.
Preparing to unpack .../135-golang-github-bgentry-speakeasy-dev_0.1.0-1_all.deb ...
Unpacking golang-github-bgentry-speakeasy-dev (0.1.0-1) ...
Selecting previously unselected package golang-github-boltdb-bolt-dev.
Preparing to unpack .../136-golang-github-boltdb-bolt-dev_1.3.1-6_all.deb ...
Unpacking golang-github-boltdb-bolt-dev (1.3.1-6) ...
Selecting previously unselected package golang-github-bradfitz-gomemcache-dev.
Preparing to unpack .../137-golang-github-bradfitz-gomemcache-dev_0.0~git20141109-3_all.deb ...
Unpacking golang-github-bradfitz-gomemcache-dev (0.0~git20141109-3) ...
Selecting previously unselected package golang-github-coreos-pkg-dev.
Preparing to unpack .../138-golang-github-coreos-pkg-dev_4-2_all.deb ...
Unpacking golang-github-coreos-pkg-dev (4-2) ...
Selecting previously unselected package libsystemd-dev:armhf.
Preparing to unpack .../139-libsystemd-dev_244-3+rpi1+b1_armhf.deb ...
Unpacking libsystemd-dev:armhf (244-3+rpi1+b1) ...
Selecting previously unselected package pkg-config.
Preparing to unpack .../140-pkg-config_0.29-6_armhf.deb ...
Unpacking pkg-config (0.29-6) ...
Selecting previously unselected package golang-github-coreos-go-systemd-dev.
Preparing to unpack .../141-golang-github-coreos-go-systemd-dev_22.0.0-1_all.deb ...
Unpacking golang-github-coreos-go-systemd-dev (22.0.0-1) ...
Selecting previously unselected package golang-github-cyphar-filepath-securejoin-dev.
Preparing to unpack .../142-golang-github-cyphar-filepath-securejoin-dev_0.2.2-1_all.deb ...
Unpacking golang-github-cyphar-filepath-securejoin-dev (0.2.2-1) ...
Selecting previously unselected package golang-github-google-go-querystring-dev.
Preparing to unpack .../143-golang-github-google-go-querystring-dev_1.0.0-1_all.deb ...
Unpacking golang-github-google-go-querystring-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-tent-http-link-go-dev.
Preparing to unpack .../144-golang-github-tent-http-link-go-dev_0.0~git20130702.0.ac974c6-6_all.deb ...
Unpacking golang-github-tent-http-link-go-dev (0.0~git20130702.0.ac974c6-6) ...
Selecting previously unselected package golang-github-digitalocean-godo-dev.
Preparing to unpack .../145-golang-github-digitalocean-godo-dev_1.1.0-1_all.deb ...
Unpacking golang-github-digitalocean-godo-dev (1.1.0-1) ...
Selecting previously unselected package golang-github-docker-go-units-dev.
Preparing to unpack .../146-golang-github-docker-go-units-dev_0.4.0-1_all.deb ...
Unpacking golang-github-docker-go-units-dev (0.4.0-1) ...
Selecting previously unselected package golang-github-opencontainers-selinux-dev.
Preparing to unpack .../147-golang-github-opencontainers-selinux-dev_1.3.0-2_all.deb ...
Unpacking golang-github-opencontainers-selinux-dev (1.3.0-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonpointer-dev.
Preparing to unpack .../148-golang-github-xeipuuv-gojsonpointer-dev_0.0~git20151027.0.e0fe6f6-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonpointer-dev (0.0~git20151027.0.e0fe6f6-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonreference-dev.
Preparing to unpack .../149-golang-github-xeipuuv-gojsonreference-dev_0.0~git20150808.0.e02fc20-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonreference-dev (0.0~git20150808.0.e02fc20-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonschema-dev.
Preparing to unpack .../150-golang-github-xeipuuv-gojsonschema-dev_0.0~git20170210.0.6b67b3f-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonschema-dev (0.0~git20170210.0.6b67b3f-2) ...
Selecting previously unselected package golang-github-opencontainers-specs-dev.
Preparing to unpack .../151-golang-github-opencontainers-specs-dev_1.0.1+git20190408.a1b50f6-1_all.deb ...
Unpacking golang-github-opencontainers-specs-dev (1.0.1+git20190408.a1b50f6-1) ...
Selecting previously unselected package libseccomp-dev:armhf.
Preparing to unpack .../152-libseccomp-dev_2.4.2-2+rpi1_armhf.deb ...
Unpacking libseccomp-dev:armhf (2.4.2-2+rpi1) ...
Selecting previously unselected package golang-github-seccomp-libseccomp-golang-dev.
Preparing to unpack .../153-golang-github-seccomp-libseccomp-golang-dev_0.9.1-1_all.deb ...
Unpacking golang-github-seccomp-libseccomp-golang-dev (0.9.1-1) ...
Selecting previously unselected package golang-github-urfave-cli-dev.
Preparing to unpack .../154-golang-github-urfave-cli-dev_1.20.0-1_all.deb ...
Unpacking golang-github-urfave-cli-dev (1.20.0-1) ...
Selecting previously unselected package golang-github-vishvananda-netns-dev.
Preparing to unpack .../155-golang-github-vishvananda-netns-dev_0.0~git20170707.0.86bef33-1_all.deb ...
Unpacking golang-github-vishvananda-netns-dev (0.0~git20170707.0.86bef33-1) ...
Selecting previously unselected package golang-github-vishvananda-netlink-dev.
Preparing to unpack .../156-golang-github-vishvananda-netlink-dev_1.0.0+git20181030.023a6da-1_all.deb ...
Unpacking golang-github-vishvananda-netlink-dev (1.0.0+git20181030.023a6da-1) ...
Selecting previously unselected package golang-gocapability-dev.
Preparing to unpack .../157-golang-gocapability-dev_0.0+git20180916.d983527-1_all.deb ...
Unpacking golang-gocapability-dev (0.0+git20180916.d983527-1) ...
Selecting previously unselected package golang-github-opencontainers-runc-dev.
Preparing to unpack .../158-golang-github-opencontainers-runc-dev_1.0.0~rc9+dfsg1-1+rpi1_all.deb ...
Unpacking golang-github-opencontainers-runc-dev (1.0.0~rc9+dfsg1-1+rpi1) ...
Selecting previously unselected package golang-github-docker-go-connections-dev.
Preparing to unpack .../159-golang-github-docker-go-connections-dev_0.4.0-1_all.deb ...
Unpacking golang-github-docker-go-connections-dev (0.4.0-1) ...
Selecting previously unselected package golang-github-elazarl-go-bindata-assetfs-dev.
Preparing to unpack .../160-golang-github-elazarl-go-bindata-assetfs-dev_1.0.0-1_all.deb ...
Unpacking golang-github-elazarl-go-bindata-assetfs-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-garyburd-redigo-dev.
Preparing to unpack .../161-golang-github-garyburd-redigo-dev_0.0~git20150901.0.d8dbe4d-2_all.deb ...
Unpacking golang-github-garyburd-redigo-dev (0.0~git20150901.0.d8dbe4d-2) ...
Selecting previously unselected package golang-github-ghodss-yaml-dev.
Preparing to unpack .../162-golang-github-ghodss-yaml-dev_1.0.0-1_all.deb ...
Unpacking golang-github-ghodss-yaml-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-go-test-deep-dev.
Preparing to unpack .../163-golang-github-go-test-deep-dev_1.0.3-1_all.deb ...
Unpacking golang-github-go-test-deep-dev (1.0.3-1) ...
Selecting previously unselected package golang-gogoprotobuf-dev.
Preparing to unpack .../164-golang-gogoprotobuf-dev_1.2.1+git20190611.dadb6258-1_all.deb ...
Unpacking golang-gogoprotobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package golang-github-gogo-googleapis-dev.
Preparing to unpack .../165-golang-github-gogo-googleapis-dev_1.2.0-1_all.deb ...
Unpacking golang-github-gogo-googleapis-dev (1.2.0-1) ...
Selecting previously unselected package golang-github-golang-snappy-dev.
Preparing to unpack .../166-golang-github-golang-snappy-dev_0.0+git20160529.d9eb7a3-3_all.deb ...
Unpacking golang-github-golang-snappy-dev (0.0+git20160529.d9eb7a3-3) ...
Selecting previously unselected package golang-github-google-btree-dev.
Preparing to unpack .../167-golang-github-google-btree-dev_1.0.0-1_all.deb ...
Unpacking golang-github-google-btree-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-docopt-docopt-go-dev.
Preparing to unpack .../168-golang-github-docopt-docopt-go-dev_0.6.2+git20160216.0.784ddc5-1_all.deb ...
Unpacking golang-github-docopt-docopt-go-dev (0.6.2+git20160216.0.784ddc5-1) ...
Selecting previously unselected package golang-github-googleapis-gnostic-dev.
Preparing to unpack .../169-golang-github-googleapis-gnostic-dev_0.2.0-1_all.deb ...
Unpacking golang-github-googleapis-gnostic-dev (0.2.0-1) ...
Selecting previously unselected package golang-github-peterbourgon-diskv-dev.
Preparing to unpack .../170-golang-github-peterbourgon-diskv-dev_3.0.0-1_all.deb ...
Unpacking golang-github-peterbourgon-diskv-dev (3.0.0-1) ...
Selecting previously unselected package golang-gomega-dev.
Preparing to unpack .../171-golang-gomega-dev_1.0+git20160910.d59fa0a-1_all.deb ...
Unpacking golang-gomega-dev (1.0+git20160910.d59fa0a-1) ...
Selecting previously unselected package golang-ginkgo-dev.
Preparing to unpack .../172-golang-ginkgo-dev_1.2.0+git20161006.acfa16a-1_armhf.deb ...
Unpacking golang-ginkgo-dev (1.2.0+git20161006.acfa16a-1) ...
Selecting previously unselected package golang-github-syndtr-goleveldb-dev.
Preparing to unpack .../173-golang-github-syndtr-goleveldb-dev_0.0~git20170725.0.b89cc31-2_all.deb ...
Unpacking golang-github-syndtr-goleveldb-dev (0.0~git20170725.0.b89cc31-2) ...
Selecting previously unselected package golang-github-gregjones-httpcache-dev.
Preparing to unpack .../174-golang-github-gregjones-httpcache-dev_0.0~git20180305.9cad4c3-1_all.deb ...
Unpacking golang-github-gregjones-httpcache-dev (0.0~git20180305.9cad4c3-1) ...
Selecting previously unselected package golang-github-hashicorp-errwrap-dev.
Preparing to unpack .../175-golang-github-hashicorp-errwrap-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-errwrap-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-checkpoint-dev.
Preparing to unpack .../176-golang-github-hashicorp-go-checkpoint-dev_0.0~git20171009.1545e56-2_all.deb ...
Unpacking golang-github-hashicorp-go-checkpoint-dev (0.0~git20171009.1545e56-2) ...
Selecting previously unselected package golang-github-denverdino-aliyungo-dev.
Preparing to unpack .../177-golang-github-denverdino-aliyungo-dev_0.0~git20180921.13fa8aa-2_all.deb ...
Unpacking golang-github-denverdino-aliyungo-dev (0.0~git20180921.13fa8aa-2) ...
Selecting previously unselected package golang-github-gophercloud-gophercloud-dev.
Preparing to unpack .../178-golang-github-gophercloud-gophercloud-dev_0.6.0-1_all.deb ...
Unpacking golang-github-gophercloud-gophercloud-dev (0.6.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-multierror-dev.
Preparing to unpack .../179-golang-github-hashicorp-go-multierror-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-multierror-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-miekg-dns-dev.
Preparing to unpack .../180-golang-github-miekg-dns-dev_1.0.4+ds-1_all.deb ...
Unpacking golang-github-miekg-dns-dev (1.0.4+ds-1) ...
Selecting previously unselected package golang-github-hashicorp-mdns-dev.
Preparing to unpack .../181-golang-github-hashicorp-mdns-dev_1.0.1-1_all.deb ...
Unpacking golang-github-hashicorp-mdns-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-packethost-packngo-dev.
Preparing to unpack .../182-golang-github-packethost-packngo-dev_0.2.0-2_all.deb ...
Unpacking golang-github-packethost-packngo-dev (0.2.0-2) ...
Selecting previously unselected package golang-github-vmware-govmomi-dev.
Preparing to unpack .../183-golang-github-vmware-govmomi-dev_0.15.0-1_all.deb ...
Unpacking golang-github-vmware-govmomi-dev (0.15.0-1) ...
Selecting previously unselected package golang-go.opencensus-dev.
Preparing to unpack .../184-golang-go.opencensus-dev_0.22.0-1_all.deb ...
Unpacking golang-go.opencensus-dev (0.22.0-1) ...
Selecting previously unselected package golang-google-api-dev.
Preparing to unpack .../185-golang-google-api-dev_0.7.0-2_all.deb ...
Unpacking golang-google-api-dev (0.7.0-2) ...
Selecting previously unselected package golang-github-hashicorp-go-discover-dev.
Preparing to unpack .../186-golang-github-hashicorp-go-discover-dev_0.0+git20190905.34a6505-2_all.deb ...
Unpacking golang-github-hashicorp-go-discover-dev (0.0+git20190905.34a6505-2) ...
Selecting previously unselected package golang-github-hashicorp-go-memdb-dev.
Preparing to unpack .../187-golang-github-hashicorp-go-memdb-dev_0.0~git20180224.1289e7ff-1_all.deb ...
Unpacking golang-github-hashicorp-go-memdb-dev (0.0~git20180224.1289e7ff-1) ...
Selecting previously unselected package golang-github-ugorji-go-codec-dev.
Preparing to unpack .../188-golang-github-ugorji-go-codec-dev_1.1.7-1_all.deb ...
Unpacking golang-github-ugorji-go-codec-dev (1.1.7-1) ...
Selecting previously unselected package golang-github-ugorji-go-msgpack-dev.
Preparing to unpack .../189-golang-github-ugorji-go-msgpack-dev_0.0~git20130605.792643-5_all.deb ...
Unpacking golang-github-ugorji-go-msgpack-dev (0.0~git20130605.792643-5) ...
Selecting previously unselected package golang-github-vmihailenco-tagparser-dev.
Preparing to unpack .../190-golang-github-vmihailenco-tagparser-dev_0.1.1-2_all.deb ...
Unpacking golang-github-vmihailenco-tagparser-dev (0.1.1-2) ...
Selecting previously unselected package golang-gopkg-vmihailenco-msgpack.v2-dev.
Preparing to unpack .../191-golang-gopkg-vmihailenco-msgpack.v2-dev_4.2.2-1_all.deb ...
Unpacking golang-gopkg-vmihailenco-msgpack.v2-dev (4.2.2-1) ...
Selecting previously unselected package golang-gopkg-tomb.v2-dev.
Preparing to unpack .../192-golang-gopkg-tomb.v2-dev_0.0~git20161208.d5d1b58-3_all.deb ...
Unpacking golang-gopkg-tomb.v2-dev (0.0~git20161208.d5d1b58-3) ...
Selecting previously unselected package libsasl2-dev.
Preparing to unpack .../193-libsasl2-dev_2.1.27+dfsg-1+b1_armhf.deb ...
Unpacking libsasl2-dev (2.1.27+dfsg-1+b1) ...
Selecting previously unselected package golang-gopkg-mgo.v2-dev.
Preparing to unpack .../194-golang-gopkg-mgo.v2-dev_2016.08.01-6_all.deb ...
Unpacking golang-gopkg-mgo.v2-dev (2016.08.01-6) ...
Selecting previously unselected package golang-github-hashicorp-go-msgpack-dev.
Preparing to unpack .../195-golang-github-hashicorp-go-msgpack-dev_0.5.5-1_all.deb ...
Unpacking golang-github-hashicorp-go-msgpack-dev (0.5.5-1) ...
Selecting previously unselected package golang-github-hashicorp-raft-dev.
Preparing to unpack .../196-golang-github-hashicorp-raft-dev_1.1.1-5_all.deb ...
Unpacking golang-github-hashicorp-raft-dev (1.1.1-5) ...
Selecting previously unselected package libjs-jquery.
Preparing to unpack .../197-libjs-jquery_3.3.1~dfsg-3_all.deb ...
Unpacking libjs-jquery (3.3.1~dfsg-3) ...
Selecting previously unselected package libjs-jquery-ui.
Preparing to unpack .../198-libjs-jquery-ui_1.12.1+dfsg-5_all.deb ...
Unpacking libjs-jquery-ui (1.12.1+dfsg-5) ...
Selecting previously unselected package golang-golang-x-tools.
Preparing to unpack .../199-golang-golang-x-tools_1%3a0.0~git20191118.07fc4c7+ds-1_armhf.deb ...
Unpacking golang-golang-x-tools (1:0.0~git20191118.07fc4c7+ds-1) ...
Selecting previously unselected package golang-github-mitchellh-reflectwalk-dev.
Preparing to unpack .../200-golang-github-mitchellh-reflectwalk-dev_0.0~git20170726.63d60e9-4_all.deb ...
Unpacking golang-github-mitchellh-reflectwalk-dev (0.0~git20170726.63d60e9-4) ...
Selecting previously unselected package golang-github-mitchellh-copystructure-dev.
Preparing to unpack .../201-golang-github-mitchellh-copystructure-dev_0.0~git20161013.0.5af94ae-2_all.deb ...
Unpacking golang-github-mitchellh-copystructure-dev (0.0~git20161013.0.5af94ae-2) ...
Selecting previously unselected package golang-github-hashicorp-go-raftchunking-dev.
Preparing to unpack .../202-golang-github-hashicorp-go-raftchunking-dev_0.6.2-2_all.deb ...
Unpacking golang-github-hashicorp-go-raftchunking-dev (0.6.2-2) ...
Selecting previously unselected package golang-github-hashicorp-go-reap-dev.
Preparing to unpack .../203-golang-github-hashicorp-go-reap-dev_0.0~git20160113.0.2d85522-3_all.deb ...
Unpacking golang-github-hashicorp-go-reap-dev (0.0~git20160113.0.2d85522-3) ...
Selecting previously unselected package golang-github-hashicorp-go-sockaddr-dev.
Preparing to unpack .../204-golang-github-hashicorp-go-sockaddr-dev_0.0~git20170627.41949a1+ds-2_all.deb ...
Unpacking golang-github-hashicorp-go-sockaddr-dev (0.0~git20170627.41949a1+ds-2) ...
Selecting previously unselected package golang-github-hashicorp-go-version-dev.
Preparing to unpack .../205-golang-github-hashicorp-go-version-dev_1.2.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-version-dev (1.2.0-1) ...
Selecting previously unselected package golang-github-hashicorp-hcl-dev.
Preparing to unpack .../206-golang-github-hashicorp-hcl-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-hcl-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mitchellh-mapstructure-dev.
Preparing to unpack .../207-golang-github-mitchellh-mapstructure-dev_1.1.2-1_all.deb ...
Unpacking golang-github-mitchellh-mapstructure-dev (1.1.2-1) ...
Selecting previously unselected package golang-github-hashicorp-hil-dev.
Preparing to unpack .../208-golang-github-hashicorp-hil-dev_0.0~git20160711.1e86c6b-1_all.deb ...
Unpacking golang-github-hashicorp-hil-dev (0.0~git20160711.1e86c6b-1) ...
Selecting previously unselected package golang-github-hashicorp-memberlist-dev.
Preparing to unpack .../209-golang-github-hashicorp-memberlist-dev_0.1.5-2_all.deb ...
Unpacking golang-github-hashicorp-memberlist-dev (0.1.5-2) ...
Selecting previously unselected package golang-github-hashicorp-raft-boltdb-dev.
Preparing to unpack .../210-golang-github-hashicorp-raft-boltdb-dev_0.0~git20171010.6e5ba93-3_all.deb ...
Unpacking golang-github-hashicorp-raft-boltdb-dev (0.0~git20171010.6e5ba93-3) ...
Selecting previously unselected package golang-github-hashicorp-net-rpc-msgpackrpc-dev.
Preparing to unpack .../211-golang-github-hashicorp-net-rpc-msgpackrpc-dev_0.0~git20151116.0.a14192a-1_all.deb ...
Unpacking golang-github-hashicorp-net-rpc-msgpackrpc-dev (0.0~git20151116.0.a14192a-1) ...
Selecting previously unselected package golang-github-hashicorp-yamux-dev.
Preparing to unpack .../212-golang-github-hashicorp-yamux-dev_0.0+git20190923.df201c7-1_all.deb ...
Unpacking golang-github-hashicorp-yamux-dev (0.0+git20190923.df201c7-1) ...
Selecting previously unselected package golang-github-hashicorp-scada-client-dev.
Preparing to unpack .../213-golang-github-hashicorp-scada-client-dev_0.0~git20160601.0.6e89678-2_all.deb ...
Unpacking golang-github-hashicorp-scada-client-dev (0.0~git20160601.0.6e89678-2) ...
Selecting previously unselected package golang-github-hashicorp-go-syslog-dev.
Preparing to unpack .../214-golang-github-hashicorp-go-syslog-dev_0.0~git20150218.0.42a2b57-1_all.deb ...
Unpacking golang-github-hashicorp-go-syslog-dev (0.0~git20150218.0.42a2b57-1) ...
Selecting previously unselected package golang-github-hashicorp-logutils-dev.
Preparing to unpack .../215-golang-github-hashicorp-logutils-dev_0.0~git20150609.0.0dc08b1-1_all.deb ...
Unpacking golang-github-hashicorp-logutils-dev (0.0~git20150609.0.0dc08b1-1) ...
Selecting previously unselected package golang-github-posener-complete-dev.
Preparing to unpack .../216-golang-github-posener-complete-dev_1.1+git20180108.57878c9-3_all.deb ...
Unpacking golang-github-posener-complete-dev (1.1+git20180108.57878c9-3) ...
Selecting previously unselected package golang-github-mitchellh-cli-dev.
Preparing to unpack .../217-golang-github-mitchellh-cli-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-cli-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-ryanuber-columnize-dev.
Preparing to unpack .../218-golang-github-ryanuber-columnize-dev_2.1.1-1_all.deb ...
Unpacking golang-github-ryanuber-columnize-dev (2.1.1-1) ...
Selecting previously unselected package golang-github-hashicorp-serf-dev.
Preparing to unpack .../219-golang-github-hashicorp-serf-dev_0.8.5~ds1-1_all.deb ...
Unpacking golang-github-hashicorp-serf-dev (0.8.5~ds1-1) ...
Selecting previously unselected package golang-github-imdario-mergo-dev.
Preparing to unpack .../220-golang-github-imdario-mergo-dev_0.3.5-1_all.deb ...
Unpacking golang-github-imdario-mergo-dev (0.3.5-1) ...
Selecting previously unselected package golang-github-inconshreveable-muxado-dev.
Preparing to unpack .../221-golang-github-inconshreveable-muxado-dev_0.0~git20140312.0.f693c7e-2_all.deb ...
Unpacking golang-github-inconshreveable-muxado-dev (0.0~git20140312.0.f693c7e-2) ...
Selecting previously unselected package golang-github-jeffail-gabs-dev.
Preparing to unpack .../222-golang-github-jeffail-gabs-dev_2.3.0-1_all.deb ...
Unpacking golang-github-jeffail-gabs-dev (2.3.0-1) ...
Selecting previously unselected package golang-github-jefferai-jsonx-dev.
Preparing to unpack .../223-golang-github-jefferai-jsonx-dev_1.0.1-2_all.deb ...
Unpacking golang-github-jefferai-jsonx-dev (1.0.1-2) ...
Selecting previously unselected package golang-github-mitchellh-go-testing-interface-dev.
Preparing to unpack .../224-golang-github-mitchellh-go-testing-interface-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-go-testing-interface-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mitchellh-hashstructure-dev.
Preparing to unpack .../225-golang-github-mitchellh-hashstructure-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-hashstructure-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-nytimes-gziphandler-dev.
Preparing to unpack .../226-golang-github-nytimes-gziphandler-dev_1.1.1-1_all.deb ...
Unpacking golang-github-nytimes-gziphandler-dev (1.1.1-1) ...
Selecting previously unselected package golang-github-ryanuber-go-glob-dev.
Preparing to unpack .../227-golang-github-ryanuber-go-glob-dev_1.0.0-2_all.deb ...
Unpacking golang-github-ryanuber-go-glob-dev (1.0.0-2) ...
Selecting previously unselected package golang-github-sap-go-hdb-dev.
Preparing to unpack .../228-golang-github-sap-go-hdb-dev_0.14.1-2_all.deb ...
Unpacking golang-github-sap-go-hdb-dev (0.14.1-2) ...
Selecting previously unselected package golang-github-shirou-gopsutil-dev.
Preparing to unpack .../229-golang-github-shirou-gopsutil-dev_2.18.06-1_all.deb ...
Unpacking golang-github-shirou-gopsutil-dev (2.18.06-1) ...
Selecting previously unselected package golang-github-spf13-pflag-dev.
Preparing to unpack .../230-golang-github-spf13-pflag-dev_1.0.3-1_all.deb ...
Unpacking golang-github-spf13-pflag-dev (1.0.3-1) ...
Selecting previously unselected package golang-gopkg-inf.v0-dev.
Preparing to unpack .../231-golang-gopkg-inf.v0-dev_0.9.0-3_all.deb ...
Unpacking golang-gopkg-inf.v0-dev (0.9.0-3) ...
Selecting previously unselected package golang-gopkg-square-go-jose.v2-dev.
Preparing to unpack .../232-golang-gopkg-square-go-jose.v2-dev_2.4.1-1_all.deb ...
Unpacking golang-gopkg-square-go-jose.v2-dev (2.4.1-1) ...
Selecting previously unselected package mockery.
Preparing to unpack .../233-mockery_0.0~git20181123.e78b021-2_armhf.deb ...
Unpacking mockery (0.0~git20181123.e78b021-2) ...
Selecting previously unselected package golang-github-hashicorp-go-rootcerts-dev.
Preparing to unpack .../234-golang-github-hashicorp-go-rootcerts-dev_0.0~git20160503.0.6bb64b3-1_all.deb ...
Unpacking golang-github-hashicorp-go-rootcerts-dev (0.0~git20160503.0.6bb64b3-1) ...
Selecting previously unselected package sbuild-build-depends-consul-dummy.
Preparing to unpack .../235-sbuild-build-depends-consul-dummy_0.invalid.0_armhf.deb ...
Unpacking sbuild-build-depends-consul-dummy (0.invalid.0) ...
Setting up golang-github-xeipuuv-gojsonpointer-dev (0.0~git20151027.0.e0fe6f6-2) ...
Setting up golang-github-dimchansky-utfbom-dev (0.0~git20170328.6c6132f-1) ...
Setting up golang-github-dgrijalva-jwt-go-v3-dev (3.2.0-2) ...
Setting up libpipeline1:armhf (1.5.1-3) ...
Setting up golang-github-google-go-cmp-dev (0.3.1-1) ...
Setting up golang-github-ryanuber-go-glob-dev (1.0.0-2) ...
Setting up golang-github-go-ini-ini-dev (1.32.0-2) ...
Setting up golang-github-hashicorp-go-uuid-dev (1.0.1-1) ...
Setting up golang-1.13-src (1.13.5-1+rpi1) ...
Setting up libseccomp-dev:armhf (2.4.2-2+rpi1) ...
Setting up golang-github-mitchellh-go-homedir-dev (1.1.0-1) ...
Setting up golang-github-google-go-querystring-dev (1.0.0-1) ...
Setting up golang-github-mitchellh-mapstructure-dev (1.1.2-1) ...
Setting up golang-dbus-dev (5.0.3-1) ...
Setting up golang-github-gogo-protobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Setting up golang-github-golang-mock-dev (1.3.1-2) ...
Setting up golang-github-stretchr-objx-dev (0.1.1+git20180825.ef50b0d-1) ...
Setting up golang-github-mitchellh-hashstructure-dev (1.0.0-1) ...
Setting up libmagic-mgc (1:5.37-6) ...
Setting up golang-github-pkg-errors-dev (0.8.1-1) ...
Setting up golang-github-hashicorp-golang-lru-dev (0.5.3-1) ...
Setting up golang-github-google-gofuzz-dev (0.0~git20170612.24818f7-1) ...
Setting up golang-github-inconshreveable-muxado-dev (0.0~git20140312.0.f693c7e-2) ...
Setting up libarchive-zip-perl (1.67-1) ...
Setting up libglib2.0-0:armhf (2.62.3-2) ...
No schema files found: doing nothing.
Setting up libprotobuf-lite17:armhf (3.6.1.3-2+rpi1) ...
Setting up libssl1.1:armhf (1.1.1d-2) ...
Setting up golang-github-ryanuber-columnize-dev (2.1.1-1) ...
Setting up libprocps7:armhf (2:3.3.15-2) ...
Setting up libdebhelper-perl (12.7.2) ...
Setting up golang-golang-x-sys-dev (0.0~git20190726.fc99dfb-1) ...
Setting up golang-github-tent-http-link-go-dev (0.0~git20130702.0.ac974c6-6) ...
Setting up libmagic1:armhf (1:5.37-6) ...
Setting up golang-github-hashicorp-go-syslog-dev (0.0~git20150218.0.42a2b57-1) ...
Setting up golang-github-golang-snappy-dev (0.0+git20160529.d9eb7a3-3) ...
Setting up golang-github-pmezard-go-difflib-dev (1.0.0-2) ...
Setting up golang-github-modern-go-concurrent-dev (1.0.3-1) ...
Setting up gettext-base (0.19.8.1-10) ...
Setting up golang-github-circonus-labs-circonusllhist-dev (0.0~git20160526.0.d724266-2) ...
Setting up golang-github-bradfitz-gomemcache-dev (0.0~git20141109-3) ...
Setting up mockery (0.0~git20181123.e78b021-2) ...
Setting up golang-github-mitchellh-go-testing-interface-dev (1.0.0-1) ...
Setting up file (1:5.37-6) ...
Setting up golang-github-seccomp-libseccomp-golang-dev (0.9.1-1) ...
Setting up golang-github-asaskevich-govalidator-dev (9+git20180720.0.f9ffefc3-1) ...
Setting up golang-github-google-btree-dev (1.0.0-1) ...
Setting up golang-github-go-stack-stack-dev (1.5.2-2) ...
Setting up golang-github-beorn7-perks-dev (0.0~git20160804.0.4c0e845-1) ...
Setting up libicu63:armhf (63.2-2) ...
Setting up golang-github-hashicorp-go-cleanhttp-dev (0.5.1-1) ...
Setting up golang-github-hashicorp-errwrap-dev (1.0.0-1) ...
Setting up golang-github-cespare-xxhash-dev (2.1.1-1) ...
Setting up golang-github-spf13-pflag-dev (1.0.3-1) ...
Setting up golang-gopkg-tomb.v2-dev (0.0~git20161208.d5d1b58-3) ...
Setting up golang-github-bgentry-speakeasy-dev (0.1.0-1) ...
Setting up golang-github-jpillora-backoff-dev (1.0.0-1) ...
Setting up golang-github-davecgh-go-spew-dev (1.1.1-2) ...
Setting up autotools-dev (20180224.1) ...
Setting up libsasl2-dev (2.1.27+dfsg-1+b1) ...
Setting up golang-github-pascaldekloe-goe-dev (0.1.0-2) ...
Setting up golang-github-go-logfmt-logfmt-dev (0.3.0-1) ...
Setting up golang-github-ugorji-go-msgpack-dev (0.0~git20130605.792643-5) ...
Setting up golang-github-go-test-deep-dev (1.0.3-1) ...
Setting up bash-completion (1:2.8-6) ...
Setting up golang-github-hashicorp-go-immutable-radix-dev (1.1.0-1) ...
Setting up golang-github-boltdb-bolt-dev (1.3.1-6) ...
Setting up libncurses6:armhf (6.1+20191019-1) ...
Setting up libsigsegv2:armhf (2.12-2) ...
Setting up golang-github-xeipuuv-gojsonreference-dev (0.0~git20150808.0.e02fc20-2) ...
Setting up libmnl0:armhf (1.0.4-2) ...
Setting up golang-golang-x-sync-dev (0.0~git20190423.1122301-1) ...
Setting up autopoint (0.19.8.1-10) ...
Setting up golang-github-kr-pty-dev (1.1.6-1) ...
Setting up golang-github-opencontainers-selinux-dev (1.3.0-2) ...
Setting up pkg-config (0.29-6) ...
Setting up golang-github-hashicorp-hcl-dev (1.0.0-1) ...
Setting up golang-github-vishvananda-netns-dev (0.0~git20170707.0.86bef33-1) ...
Setting up golang-1.13-go (1.13.5-1+rpi1) ...
Setting up libxtables12:armhf (1.8.3-2) ...
Setting up golang-gocapability-dev (0.0+git20180916.d983527-1) ...
Setting up golang-glog-dev (0.0~git20160126.23def4e-3) ...
Setting up golang-github-julienschmidt-httprouter-dev (1.1-5) ...
Setting up golang-github-hashicorp-go-multierror-dev (1.0.0-1) ...
Setting up lsof (4.93.2+dfsg-1) ...
Setting up zlib1g-dev:armhf (1:1.2.11.dfsg-1) ...
Setting up golang-github-tv42-httpunix-dev (0.0~git20150427.b75d861-2) ...
Setting up golang-github-hashicorp-go-version-dev (1.2.0-1) ...
Setting up golang-gopkg-inf.v0-dev (0.9.0-3) ...
Setting up sensible-utils (0.0.12+nmu1) ...
Setting up libuchardet0:armhf (0.0.6-3) ...
Setting up golang-github-vishvananda-netlink-dev (1.0.0+git20181030.023a6da-1) ...
Setting up procps (2:3.3.15-2) ...
update-alternatives: using /usr/bin/w.procps to provide /usr/bin/w (w) in auto mode
Setting up golang-github-cyphar-filepath-securejoin-dev (0.2.2-1) ...
Setting up golang-github-modern-go-reflect2-dev (1.0.0-1) ...
Setting up libsub-override-perl (0.09-2) ...
Setting up golang-github-dgrijalva-jwt-go-dev (3.2.0-1) ...
Setting up golang-github-armon-go-radix-dev (1.0.0-1) ...
Setting up libprotobuf17:armhf (3.6.1.3-2+rpi1) ...
Setting up golang-github-datadog-datadog-go-dev (2.1.0-2) ...
Setting up libjs-jquery (3.3.1~dfsg-3) ...
Setting up golang-golang-x-xerrors-dev (0.0~git20190717.a985d34-1) ...
Setting up golang-procfs-dev (0.0.3-1) ...
Setting up golang-src (2:1.13~1+b14) ...
Setting up openssl (1.1.1d-2) ...
Setting up libbsd0:armhf (0.10.0-1) ...
Setting up libtinfo5:armhf (6.1+20191019-1) ...
Setting up libelf1:armhf (0.176-1.1) ...
Setting up golang-github-vmihailenco-tagparser-dev (0.1.1-2) ...
Setting up golang-github-armon-circbuf-dev (0.0~git20150827.0.bbbad09-2) ...
Setting up golang-github-jeffail-gabs-dev (2.3.0-1) ...
Setting up libxml2:armhf (2.9.4+dfsg1-8) ...
Setting up golang-github-jefferai-jsonx-dev (1.0.1-2) ...
Setting up libsystemd-dev:armhf (244-3+rpi1+b1) ...
Setting up golang-github-hashicorp-yamux-dev (0.0+git20190923.df201c7-1) ...
Setting up golang-github-hashicorp-go-rootcerts-dev (0.0~git20160503.0.6bb64b3-1) ...
Setting up golang-github-hashicorp-logutils-dev (0.0~git20150609.0.0dc08b1-1) ...
Setting up libfile-stripnondeterminism-perl (1.6.3-1) ...
Setting up golang-github-mattn-go-isatty-dev (0.0.8-2) ...
Setting up golang-github-hashicorp-go-reap-dev (0.0~git20160113.0.2d85522-3) ...
Setting up golang-github-digitalocean-godo-dev (1.1.0-1) ...
Setting up golang-github-hashicorp-go-memdb-dev (0.0~git20180224.1289e7ff-1) ...
Setting up libprotoc17:armhf (3.6.1.3-2+rpi1) ...
Setting up protobuf-compiler (3.6.1.3-2+rpi1) ...
Setting up libtool (2.4.6-11) ...
Setting up golang-go (2:1.13~1+b14) ...
Setting up golang-github-mattn-go-colorable-dev (0.0.9-3) ...
Setting up iproute2 (5.4.0-1) ...
Setting up golang-github-posener-complete-dev (1.1+git20180108.57878c9-3) ...
Setting up golang-github-docker-go-units-dev (0.4.0-1) ...
Setting up m4 (1.4.18-4) ...
Setting up golang-any (2:1.13~1+b14) ...
Setting up libprotobuf-dev:armhf (3.6.1.3-2+rpi1) ...
Setting up ca-certificates (20190110) ...
Updating certificates in /etc/ssl/certs...
128 added, 0 removed; done.
Setting up golang-goprotobuf-dev (1.3.2-2) ...
Setting up libjs-jquery-ui (1.12.1+dfsg-5) ...
Setting up golang-github-kr-text-dev (0.1.0-1) ...
Setting up golang-github-elazarl-go-bindata-assetfs-dev (1.0.0-1) ...
Setting up bsdmainutils (11.1.2) ...
update-alternatives: using /usr/bin/bsd-write to provide /usr/bin/write (write) in auto mode
update-alternatives: using /usr/bin/bsd-from to provide /usr/bin/from (from) in auto mode
Setting up libcroco3:armhf (0.6.13-1) ...
Setting up gogoprotobuf (1.2.1+git20190611.dadb6258-1) ...
Setting up autoconf (2.69-11) ...
Setting up dh-strip-nondeterminism (1.6.3-1) ...
Setting up dwz (0.13-5) ...
Setting up groff-base (1.22.4-4) ...
Setting up golang-github-prometheus-client-model-dev (0.0.2+git20171117.99fa1f4-1) ...
Setting up golang-github-docopt-docopt-go-dev (0.6.2+git20160216.0.784ddc5-1) ...
Setting up golang-github-hashicorp-go-checkpoint-dev (0.0~git20171009.1545e56-2) ...
Setting up automake (1:1.16.1-4) ...
update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode
Setting up golang-github-kr-pretty-dev (0.1.0-1) ...
Setting up gettext (0.19.8.1-10) ...
Setting up golang-github-peterbourgon-diskv-dev (3.0.0-1) ...
Setting up golang-github-fatih-color-dev (1.7.0-1) ...
Setting up golang-github-hashicorp-go-sockaddr-dev (0.0~git20170627.41949a1+ds-2) ...
Setting up golang-github-garyburd-redigo-dev (0.0~git20150901.0.d8dbe4d-2) ...
Setting up golang-protobuf-extensions-dev (1.0.1-1) ...
Setting up golang-gogoprotobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Setting up golang-gopkg-check.v1-dev (0.0+git20180628.788fd78-1) ...
Setting up man-db (2.9.0-2) ...
Not building database; man-db/auto-update is not 'true'.
Setting up golang-golang-x-tools (1:0.0~git20191118.07fc4c7+ds-1) ...
Setting up golang-github-mitchellh-reflectwalk-dev (0.0~git20170726.63d60e9-4) ...
Setting up golang-github-denverdino-aliyungo-dev (0.0~git20180921.13fa8aa-2) ...
Setting up intltool-debian (0.35.0+20060710.5) ...
Setting up golang-gopkg-mgo.v2-dev (2016.08.01-6) ...
Setting up golang-github-mitchellh-cli-dev (1.0.0-1) ...
Setting up golang-github-hashicorp-hil-dev (0.0~git20160711.1e86c6b-1) ...
Setting up golang-github-gogo-googleapis-dev (1.2.0-1) ...
Setting up golang-gopkg-yaml.v2-dev (2.2.2-1) ...
Setting up golang-github-imdario-mergo-dev (0.3.5-1) ...
Setting up po-debconf (1.0.21) ...
Setting up golang-gomega-dev (1.0+git20160910.d59fa0a-1) ...
Setting up golang-github-mitchellh-copystructure-dev (0.0~git20161013.0.5af94ae-2) ...
Setting up golang-github-stretchr-testify-dev (1.4.0+ds-1) ...
Setting up golang-github-shirou-gopsutil-dev (2.18.06-1) ...
Setting up golang-github-alecthomas-units-dev (0.0~git20151022.0.2efee85-4) ...
Setting up golang-github-ghodss-yaml-dev (1.0.0-1) ...
Setting up golang-github-jmespath-go-jmespath-dev (0.2.2-3) ...
Setting up golang-github-hashicorp-go-hclog-dev (0.10.1-1) ...
Setting up golang-github-urfave-cli-dev (1.20.0-1) ...
Setting up golang-github-sirupsen-logrus-dev (1.4.2-1) ...
Setting up golang-ginkgo-dev (1.2.0+git20161006.acfa16a-1) ...
Setting up golang-gopkg-alecthomas-kingpin.v2-dev (2.2.6-1) ...
Setting up golang-github-xeipuuv-gojsonschema-dev (0.0~git20170210.0.6b67b3f-2) ...
Setting up golang-github-nytimes-gziphandler-dev (1.1.1-1) ...
Setting up golang-github-json-iterator-go-dev (1.1.4-1) ...
Setting up golang-github-hashicorp-go-retryablehttp-dev (0.6.4-1) ...
Setting up golang-github-aws-aws-sdk-go-dev (1.21.6+dfsg-2) ...
Setting up golang-github-opencontainers-specs-dev (1.0.1+git20190408.a1b50f6-1) ...
Setting up golang-github-syndtr-goleveldb-dev (0.0~git20170725.0.b89cc31-2) ...
Setting up golang-github-circonus-labs-circonus-gometrics-dev (2.3.1-2) ...
Setting up golang-github-gregjones-httpcache-dev (0.0~git20180305.9cad4c3-1) ...
Setting up golang-google-genproto-dev (0.0~git20190801.fa694d8-2) ...
Setting up dh-autoreconf (19) ...
Setting up golang-github-coreos-go-systemd-dev (22.0.0-1) ...
Setting up golang-github-opencontainers-runc-dev (1.0.0~rc9+dfsg1-1+rpi1) ...
Setting up golang-golang-x-text-dev (0.3.2-1) ...
Setting up debhelper (12.7.2) ...
Setting up golang-github-sap-go-hdb-dev (0.14.1-2) ...
Setting up golang-golang-x-net-dev (1:0.0+git20191112.2180aed+dfsg-1) ...
Setting up golang-github-vmware-govmomi-dev (0.15.0-1) ...
Setting up golang-golang-x-crypto-dev (1:0.0~git20190701.4def268-2) ...
Setting up golang-golang-x-oauth2-dev (0.0~git20190604.0f29369-2) ...
Setting up golang-golang-x-time-dev (0.0~git20161028.0.f51c127-2) ...
Setting up golang-github-opentracing-opentracing-go-dev (1.0.2-1) ...
Setting up dh-golang (1.43) ...
Setting up golang-github-gophercloud-gophercloud-dev (0.6.0-1) ...
Setting up golang-github-miekg-dns-dev (1.0.4+ds-1) ...
Setting up golang-github-coreos-pkg-dev (4-2) ...
Setting up golang-github-mwitkow-go-conntrack-dev (0.0~git20190716.2f06839-1) ...
Setting up golang-google-cloud-compute-metadata-dev (0.43.0-1) ...
Setting up golang-golang-x-tools-dev (1:0.0~git20191118.07fc4c7+ds-1) ...
Setting up golang-github-docker-go-connections-dev (0.4.0-1) ...
Setting up golang-github-packethost-packngo-dev (0.2.0-2) ...
Setting up golang-golang-x-oauth2-google-dev (0.0~git20190604.0f29369-2) ...
Setting up golang-gopkg-square-go-jose.v2-dev (2.4.1-1) ...
Setting up golang-github-azure-go-autorest-dev (10.15.5-1) ...
Setting up golang-github-googleapis-gnostic-dev (0.2.0-1) ...
Setting up golang-google-grpc-dev (1.22.1-1) ...
Setting up golang-github-ugorji-go-codec-dev (1.1.7-1) ...
Setting up golang-gopkg-vmihailenco-msgpack.v2-dev (4.2.2-1) ...
Setting up golang-go.opencensus-dev (0.22.0-1) ...
Setting up golang-github-hashicorp-mdns-dev (1.0.1-1) ...
Setting up golang-github-go-kit-kit-dev (0.6.0-2) ...
Setting up golang-github-hashicorp-go-msgpack-dev (0.5.5-1) ...
Setting up golang-github-hashicorp-net-rpc-msgpackrpc-dev (0.0~git20151116.0.a14192a-1) ...
Setting up golang-github-prometheus-common-dev (0.7.0-1) ...
Setting up golang-google-api-dev (0.7.0-2) ...
Setting up golang-github-prometheus-client-golang-dev (1.2.1-3) ...
Setting up golang-github-hashicorp-go-discover-dev (0.0+git20190905.34a6505-2) ...
Setting up golang-github-armon-go-metrics-dev (0.3.0-1) ...
Setting up golang-github-hashicorp-raft-dev (1.1.1-5) ...
Setting up golang-github-hashicorp-scada-client-dev (0.0~git20160601.0.6e89678-2) ...
Setting up golang-github-hashicorp-memberlist-dev (0.1.5-2) ...
Setting up golang-github-hashicorp-go-raftchunking-dev (0.6.2-2) ...
Setting up golang-github-hashicorp-raft-boltdb-dev (0.0~git20171010.6e5ba93-3) ...
Setting up golang-github-hashicorp-serf-dev (0.8.5~ds1-1) ...
Setting up sbuild-build-depends-consul-dummy (0.invalid.0) ...
Processing triggers for libc-bin (2.29-3+rpi1) ...
Processing triggers for ca-certificates (20190110) ...
Updating certificates in /etc/ssl/certs...
0 added, 0 removed; done.
Running hooks in /etc/ca-certificates/update.d...
done.
W: No sandbox user '_apt' on the system, can not drop privileges

+------------------------------------------------------------------------------+
| Build environment                                                            |
+------------------------------------------------------------------------------+

Kernel: Linux 4.9.0-0.bpo.4-armmp armhf (armv7l)
Toolchain package versions: binutils_2.33.1-5+rpi1 dpkg-dev_1.19.7 g++-9_9.2.1-19+rpi1+b1 gcc-9_9.2.1-19+rpi1+b1 libc6-dev_2.29-3+rpi1 libstdc++-9-dev_9.2.1-19+rpi1+b1 libstdc++6_9.2.1-19+rpi1+b1 linux-libc-dev_5.2.17-1+rpi1+b2
Package versions: adduser_3.118 apt_1.8.4 autoconf_2.69-11 automake_1:1.16.1-4 autopoint_0.19.8.1-10 autotools-dev_20180224.1 base-files_11+rpi1 base-passwd_3.5.46 bash_5.0-5 bash-completion_1:2.8-6 binutils_2.33.1-5+rpi1 binutils-arm-linux-gnueabihf_2.33.1-5+rpi1 binutils-common_2.33.1-5+rpi1 bsdmainutils_11.1.2 bsdutils_1:2.34-0.1 build-essential_12.8 bzip2_1.0.8-2 ca-certificates_20190110 coreutils_8.30-3 cpp_4:9.2.1-3.1+rpi1 cpp-9_9.2.1-19+rpi1+b1 dash_0.5.10.2-6 debconf_1.5.73 debhelper_12.7.2 debianutils_4.9 dh-autoreconf_19 dh-golang_1.43 dh-strip-nondeterminism_1.6.3-1 diffutils_1:3.7-3 dirmngr_2.2.17-3+b1 dpkg_1.19.7 dpkg-dev_1.19.7 dwz_0.13-5 e2fsprogs_1.45.4-1 fakeroot_1.24-1 fdisk_2.34-0.1 file_1:5.37-6 findutils_4.7.0-1 g++_4:9.2.1-3.1+rpi1 g++-9_9.2.1-19+rpi1+b1 gcc_4:9.2.1-3.1+rpi1 gcc-9_9.2.1-19+rpi1+b1 gcc-9-base_9.2.1-19+rpi1+b1 gettext_0.19.8.1-10 gettext-base_0.19.8.1-10 gnupg_2.2.17-3 gnupg-l10n_2.2.17-3 gnupg-utils_2.2.17-3+b1 gogoprotobuf_1.2.1+git20190611.dadb6258-1 golang-1.13-go_1.13.5-1+rpi1 golang-1.13-src_1.13.5-1+rpi1 golang-any_2:1.13~1+b14 golang-dbus-dev_5.0.3-1 golang-ginkgo-dev_1.2.0+git20161006.acfa16a-1 golang-github-alecthomas-units-dev_0.0~git20151022.0.2efee85-4 golang-github-armon-circbuf-dev_0.0~git20150827.0.bbbad09-2 golang-github-armon-go-metrics-dev_0.3.0-1 golang-github-armon-go-radix-dev_1.0.0-1 golang-github-asaskevich-govalidator-dev_9+git20180720.0.f9ffefc3-1 golang-github-aws-aws-sdk-go-dev_1.21.6+dfsg-2 golang-github-azure-go-autorest-dev_10.15.5-1 golang-github-beorn7-perks-dev_0.0~git20160804.0.4c0e845-1 golang-github-bgentry-speakeasy-dev_0.1.0-1 golang-github-boltdb-bolt-dev_1.3.1-6 golang-github-bradfitz-gomemcache-dev_0.0~git20141109-3 golang-github-cespare-xxhash-dev_2.1.1-1 golang-github-circonus-labs-circonus-gometrics-dev_2.3.1-2 golang-github-circonus-labs-circonusllhist-dev_0.0~git20160526.0.d724266-2 golang-github-coreos-go-systemd-dev_22.0.0-1 golang-github-coreos-pkg-dev_4-2 
golang-github-cyphar-filepath-securejoin-dev_0.2.2-1 golang-github-datadog-datadog-go-dev_2.1.0-2 golang-github-davecgh-go-spew-dev_1.1.1-2 golang-github-denverdino-aliyungo-dev_0.0~git20180921.13fa8aa-2 golang-github-dgrijalva-jwt-go-dev_3.2.0-1 golang-github-dgrijalva-jwt-go-v3-dev_3.2.0-2 golang-github-digitalocean-godo-dev_1.1.0-1 golang-github-dimchansky-utfbom-dev_0.0~git20170328.6c6132f-1 golang-github-docker-go-connections-dev_0.4.0-1 golang-github-docker-go-units-dev_0.4.0-1 golang-github-docopt-docopt-go-dev_0.6.2+git20160216.0.784ddc5-1 golang-github-elazarl-go-bindata-assetfs-dev_1.0.0-1 golang-github-fatih-color-dev_1.7.0-1 golang-github-garyburd-redigo-dev_0.0~git20150901.0.d8dbe4d-2 golang-github-ghodss-yaml-dev_1.0.0-1 golang-github-go-ini-ini-dev_1.32.0-2 golang-github-go-kit-kit-dev_0.6.0-2 golang-github-go-logfmt-logfmt-dev_0.3.0-1 golang-github-go-stack-stack-dev_1.5.2-2 golang-github-go-test-deep-dev_1.0.3-1 golang-github-gogo-googleapis-dev_1.2.0-1 golang-github-gogo-protobuf-dev_1.2.1+git20190611.dadb6258-1 golang-github-golang-mock-dev_1.3.1-2 golang-github-golang-snappy-dev_0.0+git20160529.d9eb7a3-3 golang-github-google-btree-dev_1.0.0-1 golang-github-google-go-cmp-dev_0.3.1-1 golang-github-google-go-querystring-dev_1.0.0-1 golang-github-google-gofuzz-dev_0.0~git20170612.24818f7-1 golang-github-googleapis-gnostic-dev_0.2.0-1 golang-github-gophercloud-gophercloud-dev_0.6.0-1 golang-github-gregjones-httpcache-dev_0.0~git20180305.9cad4c3-1 golang-github-hashicorp-errwrap-dev_1.0.0-1 golang-github-hashicorp-go-checkpoint-dev_0.0~git20171009.1545e56-2 golang-github-hashicorp-go-cleanhttp-dev_0.5.1-1 golang-github-hashicorp-go-discover-dev_0.0+git20190905.34a6505-2 golang-github-hashicorp-go-hclog-dev_0.10.1-1 golang-github-hashicorp-go-immutable-radix-dev_1.1.0-1 golang-github-hashicorp-go-memdb-dev_0.0~git20180224.1289e7ff-1 golang-github-hashicorp-go-msgpack-dev_0.5.5-1 golang-github-hashicorp-go-multierror-dev_1.0.0-1 
golang-github-hashicorp-go-raftchunking-dev_0.6.2-2 golang-github-hashicorp-go-reap-dev_0.0~git20160113.0.2d85522-3 golang-github-hashicorp-go-retryablehttp-dev_0.6.4-1 golang-github-hashicorp-go-rootcerts-dev_0.0~git20160503.0.6bb64b3-1 golang-github-hashicorp-go-sockaddr-dev_0.0~git20170627.41949a1+ds-2 golang-github-hashicorp-go-syslog-dev_0.0~git20150218.0.42a2b57-1 golang-github-hashicorp-go-uuid-dev_1.0.1-1 golang-github-hashicorp-go-version-dev_1.2.0-1 golang-github-hashicorp-golang-lru-dev_0.5.3-1 golang-github-hashicorp-hcl-dev_1.0.0-1 golang-github-hashicorp-hil-dev_0.0~git20160711.1e86c6b-1 golang-github-hashicorp-logutils-dev_0.0~git20150609.0.0dc08b1-1 golang-github-hashicorp-mdns-dev_1.0.1-1 golang-github-hashicorp-memberlist-dev_0.1.5-2 golang-github-hashicorp-net-rpc-msgpackrpc-dev_0.0~git20151116.0.a14192a-1 golang-github-hashicorp-raft-boltdb-dev_0.0~git20171010.6e5ba93-3 golang-github-hashicorp-raft-dev_1.1.1-5 golang-github-hashicorp-scada-client-dev_0.0~git20160601.0.6e89678-2 golang-github-hashicorp-serf-dev_0.8.5~ds1-1 golang-github-hashicorp-yamux-dev_0.0+git20190923.df201c7-1 golang-github-imdario-mergo-dev_0.3.5-1 golang-github-inconshreveable-muxado-dev_0.0~git20140312.0.f693c7e-2 golang-github-jeffail-gabs-dev_2.3.0-1 golang-github-jefferai-jsonx-dev_1.0.1-2 golang-github-jmespath-go-jmespath-dev_0.2.2-3 golang-github-jpillora-backoff-dev_1.0.0-1 golang-github-json-iterator-go-dev_1.1.4-1 golang-github-julienschmidt-httprouter-dev_1.1-5 golang-github-kr-pretty-dev_0.1.0-1 golang-github-kr-pty-dev_1.1.6-1 golang-github-kr-text-dev_0.1.0-1 golang-github-mattn-go-colorable-dev_0.0.9-3 golang-github-mattn-go-isatty-dev_0.0.8-2 golang-github-miekg-dns-dev_1.0.4+ds-1 golang-github-mitchellh-cli-dev_1.0.0-1 golang-github-mitchellh-copystructure-dev_0.0~git20161013.0.5af94ae-2 golang-github-mitchellh-go-homedir-dev_1.1.0-1 golang-github-mitchellh-go-testing-interface-dev_1.0.0-1 golang-github-mitchellh-hashstructure-dev_1.0.0-1 
golang-github-mitchellh-mapstructure-dev_1.1.2-1 golang-github-mitchellh-reflectwalk-dev_0.0~git20170726.63d60e9-4 golang-github-modern-go-concurrent-dev_1.0.3-1 golang-github-modern-go-reflect2-dev_1.0.0-1 golang-github-mwitkow-go-conntrack-dev_0.0~git20190716.2f06839-1 golang-github-nytimes-gziphandler-dev_1.1.1-1 golang-github-opencontainers-runc-dev_1.0.0~rc9+dfsg1-1+rpi1 golang-github-opencontainers-selinux-dev_1.3.0-2 golang-github-opencontainers-specs-dev_1.0.1+git20190408.a1b50f6-1 golang-github-opentracing-opentracing-go-dev_1.0.2-1 golang-github-packethost-packngo-dev_0.2.0-2 golang-github-pascaldekloe-goe-dev_0.1.0-2 golang-github-peterbourgon-diskv-dev_3.0.0-1 golang-github-pkg-errors-dev_0.8.1-1 golang-github-pmezard-go-difflib-dev_1.0.0-2 golang-github-posener-complete-dev_1.1+git20180108.57878c9-3 golang-github-prometheus-client-golang-dev_1.2.1-3 golang-github-prometheus-client-model-dev_0.0.2+git20171117.99fa1f4-1 golang-github-prometheus-common-dev_0.7.0-1 golang-github-ryanuber-columnize-dev_2.1.1-1 golang-github-ryanuber-go-glob-dev_1.0.0-2 golang-github-sap-go-hdb-dev_0.14.1-2 golang-github-seccomp-libseccomp-golang-dev_0.9.1-1 golang-github-shirou-gopsutil-dev_2.18.06-1 golang-github-sirupsen-logrus-dev_1.4.2-1 golang-github-spf13-pflag-dev_1.0.3-1 golang-github-stretchr-objx-dev_0.1.1+git20180825.ef50b0d-1 golang-github-stretchr-testify-dev_1.4.0+ds-1 golang-github-syndtr-goleveldb-dev_0.0~git20170725.0.b89cc31-2 golang-github-tent-http-link-go-dev_0.0~git20130702.0.ac974c6-6 golang-github-tv42-httpunix-dev_0.0~git20150427.b75d861-2 golang-github-ugorji-go-codec-dev_1.1.7-1 golang-github-ugorji-go-msgpack-dev_0.0~git20130605.792643-5 golang-github-urfave-cli-dev_1.20.0-1 golang-github-vishvananda-netlink-dev_1.0.0+git20181030.023a6da-1 golang-github-vishvananda-netns-dev_0.0~git20170707.0.86bef33-1 golang-github-vmihailenco-tagparser-dev_0.1.1-2 golang-github-vmware-govmomi-dev_0.15.0-1 
golang-github-xeipuuv-gojsonpointer-dev_0.0~git20151027.0.e0fe6f6-2 golang-github-xeipuuv-gojsonreference-dev_0.0~git20150808.0.e02fc20-2 golang-github-xeipuuv-gojsonschema-dev_0.0~git20170210.0.6b67b3f-2 golang-glog-dev_0.0~git20160126.23def4e-3 golang-go_2:1.13~1+b14 golang-go.opencensus-dev_0.22.0-1 golang-gocapability-dev_0.0+git20180916.d983527-1 golang-gogoprotobuf-dev_1.2.1+git20190611.dadb6258-1 golang-golang-x-crypto-dev_1:0.0~git20190701.4def268-2 golang-golang-x-net-dev_1:0.0+git20191112.2180aed+dfsg-1 golang-golang-x-oauth2-dev_0.0~git20190604.0f29369-2 golang-golang-x-oauth2-google-dev_0.0~git20190604.0f29369-2 golang-golang-x-sync-dev_0.0~git20190423.1122301-1 golang-golang-x-sys-dev_0.0~git20190726.fc99dfb-1 golang-golang-x-text-dev_0.3.2-1 golang-golang-x-time-dev_0.0~git20161028.0.f51c127-2 golang-golang-x-tools_1:0.0~git20191118.07fc4c7+ds-1 golang-golang-x-tools-dev_1:0.0~git20191118.07fc4c7+ds-1 golang-golang-x-xerrors-dev_0.0~git20190717.a985d34-1 golang-gomega-dev_1.0+git20160910.d59fa0a-1 golang-google-api-dev_0.7.0-2 golang-google-cloud-compute-metadata-dev_0.43.0-1 golang-google-genproto-dev_0.0~git20190801.fa694d8-2 golang-google-grpc-dev_1.22.1-1 golang-gopkg-alecthomas-kingpin.v2-dev_2.2.6-1 golang-gopkg-check.v1-dev_0.0+git20180628.788fd78-1 golang-gopkg-inf.v0-dev_0.9.0-3 golang-gopkg-mgo.v2-dev_2016.08.01-6 golang-gopkg-square-go-jose.v2-dev_2.4.1-1 golang-gopkg-tomb.v2-dev_0.0~git20161208.d5d1b58-3 golang-gopkg-vmihailenco-msgpack.v2-dev_4.2.2-1 golang-gopkg-yaml.v2-dev_2.2.2-1 golang-goprotobuf-dev_1.3.2-2 golang-procfs-dev_0.0.3-1 golang-protobuf-extensions-dev_1.0.1-1 golang-src_2:1.13~1+b14 gpg_2.2.17-3+b1 gpg-agent_2.2.17-3+b1 gpg-wks-client_2.2.17-3+b1 gpg-wks-server_2.2.17-3+b1 gpgconf_2.2.17-3+b1 gpgsm_2.2.17-3+b1 gpgv_2.2.17-3+b1 grep_3.3-1 groff-base_1.22.4-4 gzip_1.9-3 hostname_3.23 init-system-helpers_1.57 intltool-debian_0.35.0+20060710.5 iproute2_5.4.0-1 iputils-ping_3:20190709-2 libacl1_2.2.53-5 libapt-pkg5.0_1.8.4 
libarchive-zip-perl_1.67-1 libasan5_9.2.1-19+rpi1+b1 libassuan0_2.5.3-7 libatomic1_9.2.1-19+rpi1+b1 libattr1_1:2.4.48-5 libaudit-common_1:2.8.5-2 libaudit1_1:2.8.5-2+b1 libbinutils_2.33.1-5+rpi1 libblkid1_2.34-0.1 libbsd0_0.10.0-1 libbz2-1.0_1.0.8-2 libc-bin_2.29-3+rpi1 libc-dev-bin_2.29-3+rpi1 libc6_2.29-3+rpi1 libc6-dev_2.29-3+rpi1 libcap-ng0_0.7.9-2.1 libcap2_1:2.27-1 libcap2-bin_1:2.27-1 libcc1-0_9.2.1-19+rpi1+b1 libcom-err2_1.45.4-1 libcroco3_0.6.13-1 libdb5.3_5.3.28+dfsg1-0.6 libdebconfclient0_0.250 libdebhelper-perl_12.7.2 libdpkg-perl_1.19.7 libelf1_0.176-1.1 libext2fs2_1.45.4-1 libfakeroot_1.24-1 libfdisk1_2.34-0.1 libffi6_3.2.1-9 libfile-stripnondeterminism-perl_1.6.3-1 libgcc-9-dev_9.2.1-19+rpi1+b1 libgcc1_1:9.2.1-19+rpi1+b1 libgcrypt20_1.8.5-3 libgdbm-compat4_1.18.1-5 libgdbm6_1.18.1-5 libglib2.0-0_2.62.3-2 libgmp10_2:6.1.2+dfsg-4 libgnutls30_3.6.10-5 libgomp1_9.2.1-19+rpi1+b1 libgpg-error0_1.36-7 libhogweed5_3.5.1+really3.5.1-2 libicu63_63.2-2 libidn2-0_2.2.0-2 libisl22_0.22-2 libjs-jquery_3.3.1~dfsg-3 libjs-jquery-ui_1.12.1+dfsg-5 libksba8_1.3.5-2 libldap-2.4-2_2.4.48+dfsg-1+b2 libldap-common_2.4.48+dfsg-1 liblz4-1_1.9.2-2 liblzma5_5.2.4-1 libmagic-mgc_1:5.37-6 libmagic1_1:5.37-6 libmnl0_1.0.4-2 libmount1_2.34-0.1 libmpc3_1.1.0-1 libmpfr6_4.0.2-1 libncurses6_6.1+20191019-1 libncursesw6_6.1+20191019-1 libnettle7_3.5.1+really3.5.1-2 libnpth0_1.6-1 libp11-kit0_0.23.18.1-2 libpam-cap_1:2.27-1 libpam-modules_1.3.1-5 libpam-modules-bin_1.3.1-5 libpam-runtime_1.3.1-5 libpam0g_1.3.1-5 libpcre2-8-0_10.34-3 libpcre3_2:8.39-12 libperl5.30_5.30.0-9 libpipeline1_1.5.1-3 libprocps7_2:3.3.15-2 libprotobuf-dev_3.6.1.3-2+rpi1 libprotobuf-lite17_3.6.1.3-2+rpi1 libprotobuf17_3.6.1.3-2+rpi1 libprotoc17_3.6.1.3-2+rpi1 libreadline7_7.0-5 libreadline8_8.0-3 libsasl2-2_2.1.27+dfsg-1+b1 libsasl2-dev_2.1.27+dfsg-1+b1 libsasl2-modules-db_2.1.27+dfsg-1+b1 libseccomp-dev_2.4.2-2+rpi1 libseccomp2_2.4.2-2+rpi1 libselinux1_2.9-3 libsemanage-common_2.9-3 libsemanage1_2.9-3 
libsepol1_2.9-2 libsigsegv2_2.12-2 libsmartcols1_2.34-0.1 libsqlite3-0_3.30.1-1 libss2_1.45.4-1 libssl1.1_1.1.1d-2 libstdc++-9-dev_9.2.1-19+rpi1+b1 libstdc++6_9.2.1-19+rpi1+b1 libsub-override-perl_0.09-2 libsystemd-dev_244-3+rpi1+b1 libsystemd0_244-3+rpi1+b1 libtasn1-6_4.14-3 libtinfo5_6.1+20191019-1 libtinfo6_6.1+20191019-1 libtool_2.4.6-11 libubsan1_9.2.1-19+rpi1+b1 libuchardet0_0.0.6-3 libudev1_243-8+rpi1 libunistring2_0.9.10-2 libuuid1_2.34-0.1 libxml2_2.9.4+dfsg1-8 libxtables12_1.8.3-2 libzstd1_1.4.4+dfsg-1+rpi1 linux-libc-dev_5.2.17-1+rpi1+b2 login_1:4.7-2 logsave_1.45.4-1 lsb-base_11.1.0+rpi1 lsof_4.93.2+dfsg-1 m4_1.4.18-4 make_4.2.1-1.2 man-db_2.9.0-2 mawk_1.3.3-17 mockery_0.0~git20181123.e78b021-2 mount_2.34-0.1 ncurses-base_6.1+20191019-1 ncurses-bin_6.1+20191019-1 netbase_5.7 openssl_1.1.1d-2 passwd_1:4.7-2 patch_2.7.6-6 perl_5.30.0-9 perl-base_5.30.0-9 perl-modules-5.30_5.30.0-9 pinentry-curses_1.1.0-3 pkg-config_0.29-6 po-debconf_1.0.21 procps_2:3.3.15-2 protobuf-compiler_3.6.1.3-2+rpi1 raspbian-archive-keyring_20120528.2 readline-common_8.0-3 sbuild-build-depends-consul-dummy_0.invalid.0 sbuild-build-depends-core-dummy_0.invalid.0 sed_4.7-1 sensible-utils_0.0.12+nmu1 sysvinit-utils_2.96-1 tar_1.30+dfsg-6 tzdata_2019c-3 util-linux_2.34-0.1 xz-utils_5.2.4-1 zlib1g_1:1.2.11.dfsg-1 zlib1g-dev_1:1.2.11.dfsg-1

+------------------------------------------------------------------------------+
| Build                                                                        |
+------------------------------------------------------------------------------+


Unpack source
-------------

gpgv: unknown type of key resource 'trustedkeys.kbx'
gpgv: keyblock resource '/sbuild-nonexistent/.gnupg/trustedkeys.kbx': General error
gpgv: Signature made Sun Dec  1 00:03:51 2019 UTC
gpgv:                using RSA key 50BC7CF939D20C272A6B065652B6BBD953968D1B
gpgv: Can't check signature: No public key
dpkg-source: warning: failed to verify signature on ./consul_1.5.2+dfsg1-6.dsc
dpkg-source: info: extracting consul in /<<BUILDDIR>>/consul-1.5.2+dfsg1
dpkg-source: info: unpacking consul_1.5.2+dfsg1.orig.tar.xz
dpkg-source: info: unpacking consul_1.5.2+dfsg1-6.debian.tar.xz
dpkg-source: info: using patch list from debian/patches/series
dpkg-source: info: applying provider-no-k8s.patch
dpkg-source: info: applying t-skip-unreliable-tests.patch
dpkg-source: info: applying vendor-envoyproxy.patch

Check disc space
----------------

Sufficient free space for build

User Environment
----------------

APT_CONFIG=/var/lib/sbuild/apt.conf
DEB_BUILD_OPTIONS=parallel=4
HOME=/sbuild-nonexistent
LC_ALL=POSIX
LOGNAME=buildd
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
SCHROOT_ALIAS_NAME=bullseye-staging-armhf-sbuild
SCHROOT_CHROOT_NAME=bullseye-staging-armhf-sbuild
SCHROOT_COMMAND=env
SCHROOT_GID=109
SCHROOT_GROUP=buildd
SCHROOT_SESSION_ID=bullseye-staging-armhf-sbuild-8db272d1-67b2-4e85-b2ff-2e9782482bc8
SCHROOT_UID=104
SCHROOT_USER=buildd
SHELL=/bin/sh
TERM=linux
USER=buildd

dpkg-buildpackage
-----------------

dpkg-buildpackage: info: source package consul
dpkg-buildpackage: info: source version 1.5.2+dfsg1-6
dpkg-buildpackage: info: source distribution unstable
 dpkg-source --before-build .
dpkg-buildpackage: info: host architecture armhf
 fakeroot debian/rules clean
dh clean --buildsystem=golang --with=golang,bash-completion --builddirectory=_build
   dh_auto_clean -O--buildsystem=golang -O--builddirectory=_build
   dh_autoreconf_clean -O--buildsystem=golang -O--builddirectory=_build
   debian/rules override_dh_clean
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
dh_clean
## Remove Files-Excluded (when built from checkout or non-DFSG tarball):
rm -f -rv `perl -0nE 'say $1 if m{^Files\-Excluded\:\s*(.*?)(?:\n\n|Files:|Comment:)}sm;' debian/copyright`
find vendor -type d -empty -delete -print
vendor/github.com/Azure
vendor/github.com/DataDog
vendor/github.com/Jeffail
vendor/github.com/Microsoft
vendor/github.com/NYTimes
vendor/github.com/SAP
vendor/github.com/SermoDigital
vendor/github.com/StackExchange
vendor/github.com/armon
vendor/github.com/asaskevich
vendor/github.com/aws
vendor/github.com/beorn7
vendor/github.com/bgentry
vendor/github.com/boltdb
vendor/github.com/circonus-labs
vendor/github.com/davecgh
vendor/github.com/denisenkom
vendor/github.com/denverdino
vendor/github.com/dgrijalva
vendor/github.com/digitalocean
vendor/github.com/docker
vendor/github.com/elazarl
vendor/github.com/fatih
vendor/github.com/ghodss
vendor/github.com/go-ini
vendor/github.com/go-ole
vendor/github.com/go-sql-driver
vendor/github.com/gocql
vendor/github.com/gogo
vendor/github.com/golang
vendor/github.com/google
vendor/github.com/googleapis
vendor/github.com/gophercloud
vendor/github.com/gregjones
vendor/github.com/hailocab
vendor/github.com/imdario
vendor/github.com/jefferai
vendor/github.com/jmespath
vendor/github.com/joyent
vendor/github.com/json-iterator
vendor/github.com/keybase
vendor/github.com/kr
vendor/github.com/lib
vendor/github.com/mattn
vendor/github.com/matttproud
vendor/github.com/miekg
vendor/github.com/mitchellh
vendor/github.com/modern-go
vendor/github.com/nicolai86
vendor/github.com/packethost
vendor/github.com/pascaldekloe
vendor/github.com/patrickmn
vendor/github.com/peterbourgon
vendor/github.com/pkg
vendor/github.com/pmezard
vendor/github.com/posener
vendor/github.com/prometheus
vendor/github.com/renier
vendor/github.com/ryanuber
vendor/github.com/shirou
vendor/github.com/sirupsen
vendor/github.com/softlayer
vendor/github.com/spf13
vendor/github.com/stretchr
vendor/github.com/tv42
vendor/github.com/vmware
vendor/golang.org
vendor/gopkg.in/square
vendor/gopkg.in
rm -f -r test/integration
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
 debian/rules build-arch
dh build-arch --buildsystem=golang --with=golang,bash-completion --builddirectory=_build
   dh_update_autotools_config -a -O--buildsystem=golang -O--builddirectory=_build
   dh_autoreconf -a -O--buildsystem=golang -O--builddirectory=_build
   debian/rules override_dh_auto_configure
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
dh_auto_configure
mkdir -v -p _build/src/github.com/keybase/
mkdir: created directory '_build/src/github.com/keybase/'
ln -sv /usr/share/gocode/src/golang.org/x/crypto  _build/src/github.com/keybase/go-crypto
'_build/src/github.com/keybase/go-crypto' -> '/usr/share/gocode/src/golang.org/x/crypto'
mkdir -v -p _build/src/github.com/SermoDigital/
mkdir: created directory '_build/src/github.com/SermoDigital/'
ln -sv /usr/share/gocode/src/gopkg.in/square/go-jose.v1  _build/src/github.com/SermoDigital/jose
'_build/src/github.com/SermoDigital/jose' -> '/usr/share/gocode/src/gopkg.in/square/go-jose.v1'
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
   debian/rules override_dh_auto_build
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
export GOPATH=/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build \
        && /usr/bin/make -C _build/src/github.com/hashicorp/consul --makefile=/<<BUILDDIR>>/consul-1.5.2+dfsg1/GNUmakefile proto
make[2]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul'
protoc agent/connect/ca/plugin/*.proto --gofast_out=plugins=grpc:../../..
bash: git: command not found
bash: git: command not found
failed to initialize build cache at /sbuild-nonexistent/.cache/go-build: mkdir /sbuild-nonexistent: permission denied
bash: git: command not found
bash: git: command not found
bash: git: command not found
bash: git: command not found
make[2]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul'
dh_auto_build -v
	cd _build && go generate -v github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/checks github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/connect github.com/hashicorp/consul/agent/connect/ca github.com/hashicorp/consul/agent/connect/ca/plugin github.com/hashicorp/consul/agent/consul github.com/hashicorp/consul/agent/consul/authmethod github.com/hashicorp/consul/agent/consul/authmethod/kubeauth github.com/hashicorp/consul/agent/consul/authmethod/testauth github.com/hashicorp/consul/agent/consul/autopilot github.com/hashicorp/consul/agent/consul/fsm github.com/hashicorp/consul/agent/consul/prepared_query github.com/hashicorp/consul/agent/consul/state github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/api github.com/hashicorp/consul/api/watch github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update 
github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/role github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca 
github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers 
github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/tls github.com/hashicorp/consul/command/tls/ca github.com/hashicorp/consul/command/tls/ca/create github.com/hashicorp/consul/command/tls/cert github.com/hashicorp/consul/command/tls/cert/create github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version
src/github.com/hashicorp/consul/main.go
src/github.com/hashicorp/consul/main_test.go
src/github.com/hashicorp/consul/acl/acl.go
src/github.com/hashicorp/consul/acl/acl_test.go
src/github.com/hashicorp/consul/acl/errors.go
src/github.com/hashicorp/consul/acl/policy.go
src/github.com/hashicorp/consul/acl/policy_test.go
src/github.com/hashicorp/consul/agent/acl.go
src/github.com/hashicorp/consul/agent/acl_endpoint.go
src/github.com/hashicorp/consul/agent/acl_endpoint_legacy.go
src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go
src/github.com/hashicorp/consul/agent/acl_endpoint_test.go
src/github.com/hashicorp/consul/agent/acl_test.go
src/github.com/hashicorp/consul/agent/agent.go
src/github.com/hashicorp/consul/agent/agent_endpoint.go
src/github.com/hashicorp/consul/agent/agent_endpoint_test.go
src/github.com/hashicorp/consul/agent/agent_test.go
src/github.com/hashicorp/consul/agent/bindata_assetfs.go
src/github.com/hashicorp/consul/agent/blacklist.go
src/github.com/hashicorp/consul/agent/blacklist_test.go
src/github.com/hashicorp/consul/agent/catalog_endpoint.go
src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go
src/github.com/hashicorp/consul/agent/check.go
src/github.com/hashicorp/consul/agent/config.go
src/github.com/hashicorp/consul/agent/config_endpoint.go
src/github.com/hashicorp/consul/agent/config_endpoint_test.go
src/github.com/hashicorp/consul/agent/connect_auth.go
src/github.com/hashicorp/consul/agent/connect_ca_endpoint.go
src/github.com/hashicorp/consul/agent/connect_ca_endpoint_test.go
src/github.com/hashicorp/consul/agent/coordinate_endpoint.go
src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go
src/github.com/hashicorp/consul/agent/dns.go
src/github.com/hashicorp/consul/agent/dns_test.go
src/github.com/hashicorp/consul/agent/enterprise_delegate_oss.go
src/github.com/hashicorp/consul/agent/event_endpoint.go
src/github.com/hashicorp/consul/agent/event_endpoint_test.go
src/github.com/hashicorp/consul/agent/health_endpoint.go
src/github.com/hashicorp/consul/agent/health_endpoint_test.go
src/github.com/hashicorp/consul/agent/http.go
src/github.com/hashicorp/consul/agent/http_oss.go
src/github.com/hashicorp/consul/agent/http_oss_test.go
src/github.com/hashicorp/consul/agent/http_test.go
src/github.com/hashicorp/consul/agent/intentions_endpoint.go
src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go
src/github.com/hashicorp/consul/agent/keyring.go
src/github.com/hashicorp/consul/agent/keyring_test.go
src/github.com/hashicorp/consul/agent/kvs_endpoint.go
src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go
src/github.com/hashicorp/consul/agent/notify.go
src/github.com/hashicorp/consul/agent/notify_test.go
src/github.com/hashicorp/consul/agent/operator_endpoint.go
src/github.com/hashicorp/consul/agent/operator_endpoint_test.go
src/github.com/hashicorp/consul/agent/prepared_query_endpoint.go
src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go
src/github.com/hashicorp/consul/agent/remote_exec.go
src/github.com/hashicorp/consul/agent/remote_exec_test.go
src/github.com/hashicorp/consul/agent/retry_join.go
src/github.com/hashicorp/consul/agent/service_manager.go
src/github.com/hashicorp/consul/agent/service_manager_test.go
src/github.com/hashicorp/consul/agent/session_endpoint.go
src/github.com/hashicorp/consul/agent/session_endpoint_test.go
src/github.com/hashicorp/consul/agent/sidecar_service.go
src/github.com/hashicorp/consul/agent/sidecar_service_test.go
src/github.com/hashicorp/consul/agent/signal_unix.go
src/github.com/hashicorp/consul/agent/snapshot_endpoint.go
src/github.com/hashicorp/consul/agent/snapshot_endpoint_test.go
src/github.com/hashicorp/consul/agent/status_endpoint.go
src/github.com/hashicorp/consul/agent/status_endpoint_test.go
src/github.com/hashicorp/consul/agent/testagent.go
src/github.com/hashicorp/consul/agent/testagent_test.go
src/github.com/hashicorp/consul/agent/translate_addr.go
src/github.com/hashicorp/consul/agent/txn_endpoint.go
src/github.com/hashicorp/consul/agent/txn_endpoint_test.go
src/github.com/hashicorp/consul/agent/ui_endpoint.go
src/github.com/hashicorp/consul/agent/ui_endpoint_test.go
src/github.com/hashicorp/consul/agent/user_event.go
src/github.com/hashicorp/consul/agent/user_event_test.go
src/github.com/hashicorp/consul/agent/util.go
src/github.com/hashicorp/consul/agent/util_test.go
src/github.com/hashicorp/consul/agent/watch_handler.go
src/github.com/hashicorp/consul/agent/watch_handler_test.go
src/github.com/hashicorp/consul/agent/ae/ae.go
src/github.com/hashicorp/consul/agent/ae/ae_test.go
src/github.com/hashicorp/consul/agent/ae/trigger.go
src/github.com/hashicorp/consul/agent/cache/cache.go
Generating mock for: Request in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/mock_Request.go
Generating mock for: Type in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/mock_Type.go
src/github.com/hashicorp/consul/agent/cache/cache_test.go
src/github.com/hashicorp/consul/agent/cache/entry.go
src/github.com/hashicorp/consul/agent/cache/entry_test.go
src/github.com/hashicorp/consul/agent/cache/mock_Request.go
src/github.com/hashicorp/consul/agent/cache/mock_Type.go
src/github.com/hashicorp/consul/agent/cache/request.go
src/github.com/hashicorp/consul/agent/cache/testing.go
src/github.com/hashicorp/consul/agent/cache/type.go
src/github.com/hashicorp/consul/agent/cache/watch.go
src/github.com/hashicorp/consul/agent/cache/watch_test.go
src/github.com/hashicorp/consul/agent/cache-types/catalog_services.go
src/github.com/hashicorp/consul/agent/cache-types/catalog_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_leaf.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_leaf_test.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root_test.go
src/github.com/hashicorp/consul/agent/cache-types/health_services.go
src/github.com/hashicorp/consul/agent/cache-types/health_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/intention_match.go
src/github.com/hashicorp/consul/agent/cache-types/intention_match_test.go
src/github.com/hashicorp/consul/agent/cache-types/mock_RPC.go
src/github.com/hashicorp/consul/agent/cache-types/node_services.go
src/github.com/hashicorp/consul/agent/cache-types/node_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/prepared_query.go
src/github.com/hashicorp/consul/agent/cache-types/prepared_query_test.go
src/github.com/hashicorp/consul/agent/cache-types/resolved_service_config.go
src/github.com/hashicorp/consul/agent/cache-types/resolved_service_config_test.go
src/github.com/hashicorp/consul/agent/cache-types/rpc.go
Generating mock for: RPC in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/mock_RPC.go
src/github.com/hashicorp/consul/agent/cache-types/testing.go
src/github.com/hashicorp/consul/agent/checks/alias.go
src/github.com/hashicorp/consul/agent/checks/alias_test.go
src/github.com/hashicorp/consul/agent/checks/check.go
src/github.com/hashicorp/consul/agent/checks/check_test.go
src/github.com/hashicorp/consul/agent/checks/docker.go
src/github.com/hashicorp/consul/agent/checks/docker_unix.go
src/github.com/hashicorp/consul/agent/checks/grpc.go
src/github.com/hashicorp/consul/agent/checks/grpc_test.go
src/github.com/hashicorp/consul/agent/config/builder.go
src/github.com/hashicorp/consul/agent/config/config.go
src/github.com/hashicorp/consul/agent/config/default.go
src/github.com/hashicorp/consul/agent/config/default_oss.go
src/github.com/hashicorp/consul/agent/config/doc.go
src/github.com/hashicorp/consul/agent/config/flags.go
src/github.com/hashicorp/consul/agent/config/flags_test.go
src/github.com/hashicorp/consul/agent/config/flagset.go
src/github.com/hashicorp/consul/agent/config/merge.go
src/github.com/hashicorp/consul/agent/config/merge_test.go
src/github.com/hashicorp/consul/agent/config/patch_hcl.go
src/github.com/hashicorp/consul/agent/config/patch_hcl_test.go
src/github.com/hashicorp/consul/agent/config/runtime.go
src/github.com/hashicorp/consul/agent/config/runtime_test.go
src/github.com/hashicorp/consul/agent/config/segment_oss.go
src/github.com/hashicorp/consul/agent/config/segment_oss_test.go
src/github.com/hashicorp/consul/agent/connect/csr.go
src/github.com/hashicorp/consul/agent/connect/generate.go
src/github.com/hashicorp/consul/agent/connect/parsing.go
src/github.com/hashicorp/consul/agent/connect/testing_ca.go
src/github.com/hashicorp/consul/agent/connect/testing_ca_test.go
src/github.com/hashicorp/consul/agent/connect/testing_spiffe.go
src/github.com/hashicorp/consul/agent/connect/uri.go
src/github.com/hashicorp/consul/agent/connect/uri_agent.go
src/github.com/hashicorp/consul/agent/connect/uri_agent_test.go
src/github.com/hashicorp/consul/agent/connect/uri_service.go
src/github.com/hashicorp/consul/agent/connect/uri_service_test.go
src/github.com/hashicorp/consul/agent/connect/uri_signing.go
src/github.com/hashicorp/consul/agent/connect/uri_signing_test.go
src/github.com/hashicorp/consul/agent/connect/uri_test.go
src/github.com/hashicorp/consul/agent/connect/ca/mock_Provider.go
src/github.com/hashicorp/consul/agent/connect/ca/provider.go
Generating mock for: Provider in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/connect/ca/mock_Provider.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul_config.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul_test.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_vault.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_vault_test.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/client.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/plugin.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/plugin_test.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/provider.pb.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/serve.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/transport_grpc.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/transport_netrpc.go
src/github.com/hashicorp/consul/agent/consul/acl.go
src/github.com/hashicorp/consul/agent/consul/acl_authmethod.go
src/github.com/hashicorp/consul/agent/consul/acl_authmethod_test.go
src/github.com/hashicorp/consul/agent/consul/acl_client.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint_legacy.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_legacy.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_legacy_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_types.go
src/github.com/hashicorp/consul/agent/consul/acl_server.go
src/github.com/hashicorp/consul/agent/consul/acl_test.go
src/github.com/hashicorp/consul/agent/consul/acl_token_exp.go
src/github.com/hashicorp/consul/agent/consul/acl_token_exp_test.go
src/github.com/hashicorp/consul/agent/consul/auto_encrypt.go
src/github.com/hashicorp/consul/agent/consul/auto_encrypt_endpoint.go
src/github.com/hashicorp/consul/agent/consul/auto_encrypt_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot.go
src/github.com/hashicorp/consul/agent/consul/autopilot_oss.go
src/github.com/hashicorp/consul/agent/consul/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go
src/github.com/hashicorp/consul/agent/consul/catalog_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/client.go
src/github.com/hashicorp/consul/agent/consul/client_serf.go
src/github.com/hashicorp/consul/agent/consul/client_test.go
src/github.com/hashicorp/consul/agent/consul/config.go
src/github.com/hashicorp/consul/agent/consul/config_endpoint.go
src/github.com/hashicorp/consul/agent/consul/config_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/config_replication.go
src/github.com/hashicorp/consul/agent/consul/config_replication_test.go
src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go
src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/consul_ca_delegate.go
src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go
src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/enterprise_client_oss.go
src/github.com/hashicorp/consul/agent/consul/enterprise_server_oss.go
src/github.com/hashicorp/consul/agent/consul/filter.go
src/github.com/hashicorp/consul/agent/consul/filter_test.go
src/github.com/hashicorp/consul/agent/consul/flood.go
src/github.com/hashicorp/consul/agent/consul/health_endpoint.go
src/github.com/hashicorp/consul/agent/consul/health_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/helper_test.go
src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go
src/github.com/hashicorp/consul/agent/consul/intention_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/internal_endpoint.go
src/github.com/hashicorp/consul/agent/consul/internal_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/issue_test.go
src/github.com/hashicorp/consul/agent/consul/kvs_endpoint.go
src/github.com/hashicorp/consul/agent/consul/kvs_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/leader.go
src/github.com/hashicorp/consul/agent/consul/leader_oss.go
src/github.com/hashicorp/consul/agent/consul/leader_test.go
src/github.com/hashicorp/consul/agent/consul/merge.go
src/github.com/hashicorp/consul/agent/consul/merge_test.go
src/github.com/hashicorp/consul/agent/consul/operator_autopilot_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_autopilot_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/operator_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_raft_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_raft_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query_endpoint.go
src/github.com/hashicorp/consul/agent/consul/prepared_query_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/raft_rpc.go
src/github.com/hashicorp/consul/agent/consul/replication.go
src/github.com/hashicorp/consul/agent/consul/rpc.go
src/github.com/hashicorp/consul/agent/consul/rpc_test.go
src/github.com/hashicorp/consul/agent/consul/rtt.go
src/github.com/hashicorp/consul/agent/consul/rtt_test.go
src/github.com/hashicorp/consul/agent/consul/segment_oss.go
src/github.com/hashicorp/consul/agent/consul/serf_test.go
src/github.com/hashicorp/consul/agent/consul/server.go
src/github.com/hashicorp/consul/agent/consul/server_lookup.go
src/github.com/hashicorp/consul/agent/consul/server_lookup_test.go
src/github.com/hashicorp/consul/agent/consul/server_oss.go
src/github.com/hashicorp/consul/agent/consul/server_serf.go
src/github.com/hashicorp/consul/agent/consul/server_test.go
src/github.com/hashicorp/consul/agent/consul/session_endpoint.go
src/github.com/hashicorp/consul/agent/consul/session_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/session_timers.go
src/github.com/hashicorp/consul/agent/consul/session_timers_test.go
src/github.com/hashicorp/consul/agent/consul/session_ttl.go
src/github.com/hashicorp/consul/agent/consul/session_ttl_test.go
src/github.com/hashicorp/consul/agent/consul/snapshot_endpoint.go
src/github.com/hashicorp/consul/agent/consul/snapshot_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/stats_fetcher.go
src/github.com/hashicorp/consul/agent/consul/stats_fetcher_test.go
src/github.com/hashicorp/consul/agent/consul/status_endpoint.go
src/github.com/hashicorp/consul/agent/consul/status_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/txn_endpoint.go
src/github.com/hashicorp/consul/agent/consul/txn_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/util.go
src/github.com/hashicorp/consul/agent/consul/util_test.go
src/github.com/hashicorp/consul/agent/consul/authmethod/authmethods.go
src/github.com/hashicorp/consul/agent/consul/authmethod/kubeauth/k8s.go
src/github.com/hashicorp/consul/agent/consul/authmethod/kubeauth/k8s_test.go
src/github.com/hashicorp/consul/agent/consul/authmethod/kubeauth/testing.go
src/github.com/hashicorp/consul/agent/consul/authmethod/testauth/testing.go
src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go
src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot/promotion.go
src/github.com/hashicorp/consul/agent/consul/autopilot/promotion_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot/structs.go
src/github.com/hashicorp/consul/agent/consul/autopilot/structs_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/commands_oss.go
src/github.com/hashicorp/consul/agent/consul/fsm/commands_oss_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/fsm.go
src/github.com/hashicorp/consul/agent/consul/fsm/fsm_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot_oss.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot_oss_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/template.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/template_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/walk.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/walk_test.go
src/github.com/hashicorp/consul/agent/consul/state/acl.go
src/github.com/hashicorp/consul/agent/consul/state/acl_test.go
src/github.com/hashicorp/consul/agent/consul/state/autopilot.go
src/github.com/hashicorp/consul/agent/consul/state/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/state/catalog.go
src/github.com/hashicorp/consul/agent/consul/state/catalog_test.go
src/github.com/hashicorp/consul/agent/consul/state/config_entry.go
src/github.com/hashicorp/consul/agent/consul/state/config_entry_test.go
src/github.com/hashicorp/consul/agent/consul/state/connect_ca.go
src/github.com/hashicorp/consul/agent/consul/state/connect_ca_test.go
src/github.com/hashicorp/consul/agent/consul/state/coordinate.go
src/github.com/hashicorp/consul/agent/consul/state/coordinate_test.go
src/github.com/hashicorp/consul/agent/consul/state/delay.go
src/github.com/hashicorp/consul/agent/consul/state/delay_test.go
src/github.com/hashicorp/consul/agent/consul/state/graveyard.go
src/github.com/hashicorp/consul/agent/consul/state/graveyard_test.go
src/github.com/hashicorp/consul/agent/consul/state/index_connect.go
src/github.com/hashicorp/consul/agent/consul/state/index_connect_test.go
src/github.com/hashicorp/consul/agent/consul/state/intention.go
src/github.com/hashicorp/consul/agent/consul/state/intention_test.go
src/github.com/hashicorp/consul/agent/consul/state/kvs.go
src/github.com/hashicorp/consul/agent/consul/state/kvs_test.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_index.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_index_test.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_test.go
src/github.com/hashicorp/consul/agent/consul/state/schema.go
src/github.com/hashicorp/consul/agent/consul/state/schema_test.go
src/github.com/hashicorp/consul/agent/consul/state/session.go
src/github.com/hashicorp/consul/agent/consul/state/session_test.go
src/github.com/hashicorp/consul/agent/consul/state/state_store.go
src/github.com/hashicorp/consul/agent/consul/state/state_store_test.go
src/github.com/hashicorp/consul/agent/consul/state/tombstone_gc.go
src/github.com/hashicorp/consul/agent/consul/state/tombstone_gc_test.go
src/github.com/hashicorp/consul/agent/consul/state/txn.go
src/github.com/hashicorp/consul/agent/consul/state/txn_test.go
src/github.com/hashicorp/consul/agent/debug/host.go
src/github.com/hashicorp/consul/agent/debug/host_test.go
src/github.com/hashicorp/consul/agent/exec/exec.go
src/github.com/hashicorp/consul/agent/exec/exec_unix.go
src/github.com/hashicorp/consul/agent/local/state.go
src/github.com/hashicorp/consul/agent/local/testing.go
src/github.com/hashicorp/consul/agent/local/state_test.go
src/github.com/hashicorp/consul/agent/metadata/build.go
src/github.com/hashicorp/consul/agent/metadata/build_test.go
src/github.com/hashicorp/consul/agent/metadata/server.go
src/github.com/hashicorp/consul/agent/metadata/server_internal_test.go
src/github.com/hashicorp/consul/agent/metadata/server_test.go
src/github.com/hashicorp/consul/agent/mock/notify.go
src/github.com/hashicorp/consul/agent/pool/conn.go
src/github.com/hashicorp/consul/agent/pool/pool.go
src/github.com/hashicorp/consul/agent/proxycfg/manager.go
src/github.com/hashicorp/consul/agent/proxycfg/manager_test.go
src/github.com/hashicorp/consul/agent/proxycfg/proxycfg.go
src/github.com/hashicorp/consul/agent/proxycfg/snapshot.go
src/github.com/hashicorp/consul/agent/proxycfg/state.go
src/github.com/hashicorp/consul/agent/proxycfg/state_test.go
src/github.com/hashicorp/consul/agent/proxycfg/testing.go
src/github.com/hashicorp/consul/agent/proxyprocess/daemon.go
src/github.com/hashicorp/consul/agent/proxyprocess/daemon_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/exitstatus_syscall.go
src/github.com/hashicorp/consul/agent/proxyprocess/manager.go
src/github.com/hashicorp/consul/agent/proxyprocess/manager_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/noop.go
src/github.com/hashicorp/consul/agent/proxyprocess/noop_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/process.go
src/github.com/hashicorp/consul/agent/proxyprocess/process_unix.go
src/github.com/hashicorp/consul/agent/proxyprocess/proxy.go
src/github.com/hashicorp/consul/agent/proxyprocess/proxy_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/root.go
src/github.com/hashicorp/consul/agent/proxyprocess/snapshot.go
src/github.com/hashicorp/consul/agent/proxyprocess/test.go
src/github.com/hashicorp/consul/agent/router/manager.go
src/github.com/hashicorp/consul/agent/router/manager_internal_test.go
src/github.com/hashicorp/consul/agent/router/router.go
src/github.com/hashicorp/consul/agent/router/router_test.go
src/github.com/hashicorp/consul/agent/router/serf_adapter.go
src/github.com/hashicorp/consul/agent/router/serf_flooder.go
src/github.com/hashicorp/consul/agent/router/manager_test.go
src/github.com/hashicorp/consul/agent/structs/acl.go
src/github.com/hashicorp/consul/agent/structs/acl_cache.go
src/github.com/hashicorp/consul/agent/structs/acl_cache_test.go
src/github.com/hashicorp/consul/agent/structs/acl_legacy.go
src/github.com/hashicorp/consul/agent/structs/acl_legacy_test.go
src/github.com/hashicorp/consul/agent/structs/acl_test.go
src/github.com/hashicorp/consul/agent/structs/auto_encrypt.go
src/github.com/hashicorp/consul/agent/structs/catalog.go
src/github.com/hashicorp/consul/agent/structs/check_definition.go
src/github.com/hashicorp/consul/agent/structs/check_definition_test.go
src/github.com/hashicorp/consul/agent/structs/check_type.go
src/github.com/hashicorp/consul/agent/structs/config_entry.go
src/github.com/hashicorp/consul/agent/structs/config_entry_test.go
src/github.com/hashicorp/consul/agent/structs/connect.go
src/github.com/hashicorp/consul/agent/structs/connect_ca.go
src/github.com/hashicorp/consul/agent/structs/connect_ca_test.go
src/github.com/hashicorp/consul/agent/structs/connect_proxy_config.go
src/github.com/hashicorp/consul/agent/structs/connect_proxy_config_test.go
src/github.com/hashicorp/consul/agent/structs/connect_test.go
src/github.com/hashicorp/consul/agent/structs/errors.go
src/github.com/hashicorp/consul/agent/structs/intention.go
src/github.com/hashicorp/consul/agent/structs/intention_test.go
src/github.com/hashicorp/consul/agent/structs/operator.go
src/github.com/hashicorp/consul/agent/structs/prepared_query.go
src/github.com/hashicorp/consul/agent/structs/prepared_query_test.go
src/github.com/hashicorp/consul/agent/structs/sanitize_oss.go
src/github.com/hashicorp/consul/agent/structs/service_definition.go
src/github.com/hashicorp/consul/agent/structs/service_definition_test.go
src/github.com/hashicorp/consul/agent/structs/snapshot.go
src/github.com/hashicorp/consul/agent/structs/structs.go
src/github.com/hashicorp/consul/agent/structs/structs_filtering_test.go
src/github.com/hashicorp/consul/agent/structs/structs_test.go
src/github.com/hashicorp/consul/agent/structs/testing_catalog.go
src/github.com/hashicorp/consul/agent/structs/testing_connect_proxy_config.go
src/github.com/hashicorp/consul/agent/structs/testing_intention.go
src/github.com/hashicorp/consul/agent/structs/testing_service_definition.go
src/github.com/hashicorp/consul/agent/structs/txn.go
src/github.com/hashicorp/consul/agent/systemd/notify.go
src/github.com/hashicorp/consul/agent/token/store.go
src/github.com/hashicorp/consul/agent/token/store_test.go
src/github.com/hashicorp/consul/agent/xds/clusters.go
src/github.com/hashicorp/consul/agent/xds/clusters_test.go
src/github.com/hashicorp/consul/agent/xds/config.go
src/github.com/hashicorp/consul/agent/xds/config_test.go
src/github.com/hashicorp/consul/agent/xds/endpoints.go
src/github.com/hashicorp/consul/agent/xds/endpoints_test.go
src/github.com/hashicorp/consul/agent/xds/golden_test.go
src/github.com/hashicorp/consul/agent/xds/listeners.go
src/github.com/hashicorp/consul/agent/xds/listeners_test.go
src/github.com/hashicorp/consul/agent/xds/response.go
src/github.com/hashicorp/consul/agent/xds/routes.go
src/github.com/hashicorp/consul/agent/xds/server.go
src/github.com/hashicorp/consul/agent/xds/server_test.go
src/github.com/hashicorp/consul/agent/xds/testing.go
src/github.com/hashicorp/consul/agent/xds/xds.go
src/github.com/hashicorp/consul/api/acl.go
src/github.com/hashicorp/consul/api/acl_test.go
src/github.com/hashicorp/consul/api/agent.go
src/github.com/hashicorp/consul/api/agent_test.go
src/github.com/hashicorp/consul/api/api.go
src/github.com/hashicorp/consul/api/api_test.go
src/github.com/hashicorp/consul/api/catalog.go
src/github.com/hashicorp/consul/api/catalog_test.go
src/github.com/hashicorp/consul/api/config_entry.go
src/github.com/hashicorp/consul/api/config_entry_test.go
src/github.com/hashicorp/consul/api/connect.go
src/github.com/hashicorp/consul/api/connect_ca.go
src/github.com/hashicorp/consul/api/connect_ca_test.go
src/github.com/hashicorp/consul/api/connect_intention.go
src/github.com/hashicorp/consul/api/connect_intention_test.go
src/github.com/hashicorp/consul/api/coordinate.go
src/github.com/hashicorp/consul/api/coordinate_test.go
src/github.com/hashicorp/consul/api/debug.go
src/github.com/hashicorp/consul/api/debug_test.go
src/github.com/hashicorp/consul/api/event.go
src/github.com/hashicorp/consul/api/event_test.go
src/github.com/hashicorp/consul/api/health.go
src/github.com/hashicorp/consul/api/health_test.go
src/github.com/hashicorp/consul/api/kv.go
src/github.com/hashicorp/consul/api/kv_test.go
src/github.com/hashicorp/consul/api/lock.go
src/github.com/hashicorp/consul/api/lock_test.go
src/github.com/hashicorp/consul/api/operator.go
src/github.com/hashicorp/consul/api/operator_area.go
src/github.com/hashicorp/consul/api/operator_autopilot.go
src/github.com/hashicorp/consul/api/operator_autopilot_test.go
src/github.com/hashicorp/consul/api/operator_keyring.go
src/github.com/hashicorp/consul/api/operator_keyring_test.go
src/github.com/hashicorp/consul/api/operator_raft.go
src/github.com/hashicorp/consul/api/operator_raft_test.go
src/github.com/hashicorp/consul/api/operator_segment.go
src/github.com/hashicorp/consul/api/prepared_query.go
src/github.com/hashicorp/consul/api/prepared_query_test.go
src/github.com/hashicorp/consul/api/raw.go
src/github.com/hashicorp/consul/api/semaphore.go
src/github.com/hashicorp/consul/api/semaphore_test.go
src/github.com/hashicorp/consul/api/session.go
src/github.com/hashicorp/consul/api/session_test.go
src/github.com/hashicorp/consul/api/snapshot.go
src/github.com/hashicorp/consul/api/snapshot_test.go
src/github.com/hashicorp/consul/api/status.go
src/github.com/hashicorp/consul/api/status_test.go
src/github.com/hashicorp/consul/api/txn.go
src/github.com/hashicorp/consul/api/txn_test.go
src/github.com/hashicorp/consul/api/watch/funcs.go
src/github.com/hashicorp/consul/api/watch/plan.go
src/github.com/hashicorp/consul/api/watch/plan_test.go
src/github.com/hashicorp/consul/api/watch/watch.go
src/github.com/hashicorp/consul/api/watch/watch_test.go
src/github.com/hashicorp/consul/api/watch/funcs_test.go
src/github.com/hashicorp/consul/command/commands_oss.go
src/github.com/hashicorp/consul/command/registry.go
src/github.com/hashicorp/consul/command/acl/acl.go
src/github.com/hashicorp/consul/command/acl/acl_helpers.go
src/github.com/hashicorp/consul/command/acl/agenttokens/agent_tokens.go
src/github.com/hashicorp/consul/command/acl/agenttokens/agent_tokens_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/authmethod.go
src/github.com/hashicorp/consul/command/acl/authmethod/create/authmethod_create.go
src/github.com/hashicorp/consul/command/acl/authmethod/create/authmethod_create_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/delete/authmethod_delete.go
src/github.com/hashicorp/consul/command/acl/authmethod/delete/authmethod_delete_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/list/authmethod_list.go
src/github.com/hashicorp/consul/command/acl/authmethod/list/authmethod_list_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/read/authmethod_read.go
src/github.com/hashicorp/consul/command/acl/authmethod/read/authmethod_read_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/update/authmethod_update.go
src/github.com/hashicorp/consul/command/acl/authmethod/update/authmethod_update_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/bindingrule.go
src/github.com/hashicorp/consul/command/acl/bindingrule/create/bindingrule_create.go
src/github.com/hashicorp/consul/command/acl/bindingrule/create/bindingrule_create_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/delete/bindingrule_delete.go
src/github.com/hashicorp/consul/command/acl/bindingrule/delete/bindingrule_delete_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/list/bindingrule_list.go
src/github.com/hashicorp/consul/command/acl/bindingrule/list/bindingrule_list_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/read/bindingrule_read.go
src/github.com/hashicorp/consul/command/acl/bindingrule/read/bindingrule_read_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/update/bindingrule_update.go
src/github.com/hashicorp/consul/command/acl/bindingrule/update/bindingrule_update_test.go
src/github.com/hashicorp/consul/command/acl/bootstrap/bootstrap.go
src/github.com/hashicorp/consul/command/acl/bootstrap/bootstrap_test.go
src/github.com/hashicorp/consul/command/acl/policy/policy.go
src/github.com/hashicorp/consul/command/acl/policy/create/policy_create.go
src/github.com/hashicorp/consul/command/acl/policy/create/policy_create_test.go
src/github.com/hashicorp/consul/command/acl/policy/delete/policy_delete.go
src/github.com/hashicorp/consul/command/acl/policy/delete/policy_delete_test.go
src/github.com/hashicorp/consul/command/acl/policy/list/policy_list.go
src/github.com/hashicorp/consul/command/acl/policy/list/policy_list_test.go
src/github.com/hashicorp/consul/command/acl/policy/read/policy_read.go
src/github.com/hashicorp/consul/command/acl/policy/read/policy_read_test.go
src/github.com/hashicorp/consul/command/acl/policy/update/policy_update.go
src/github.com/hashicorp/consul/command/acl/policy/update/policy_update_test.go
src/github.com/hashicorp/consul/command/acl/role/role.go
src/github.com/hashicorp/consul/command/acl/role/create/role_create.go
src/github.com/hashicorp/consul/command/acl/role/create/role_create_test.go
src/github.com/hashicorp/consul/command/acl/role/delete/role_delete.go
src/github.com/hashicorp/consul/command/acl/role/delete/role_delete_test.go
src/github.com/hashicorp/consul/command/acl/role/list/role_list.go
src/github.com/hashicorp/consul/command/acl/role/list/role_list_test.go
src/github.com/hashicorp/consul/command/acl/role/read/role_read.go
src/github.com/hashicorp/consul/command/acl/role/read/role_read_test.go
src/github.com/hashicorp/consul/command/acl/role/update/role_update.go
src/github.com/hashicorp/consul/command/acl/role/update/role_update_test.go
src/github.com/hashicorp/consul/command/acl/rules/translate.go
src/github.com/hashicorp/consul/command/acl/rules/translate_test.go
src/github.com/hashicorp/consul/command/acl/token/token.go
src/github.com/hashicorp/consul/command/acl/token/clone/token_clone.go
src/github.com/hashicorp/consul/command/acl/token/clone/token_clone_test.go
src/github.com/hashicorp/consul/command/acl/token/create/token_create.go
src/github.com/hashicorp/consul/command/acl/token/create/token_create_test.go
src/github.com/hashicorp/consul/command/acl/token/delete/token_delete.go
src/github.com/hashicorp/consul/command/acl/token/delete/token_delete_test.go
src/github.com/hashicorp/consul/command/acl/token/list/token_list.go
src/github.com/hashicorp/consul/command/acl/token/list/token_list_test.go
src/github.com/hashicorp/consul/command/acl/token/read/token_read.go
src/github.com/hashicorp/consul/command/acl/token/read/token_read_test.go
src/github.com/hashicorp/consul/command/acl/token/update/token_update.go
src/github.com/hashicorp/consul/command/acl/token/update/token_update_test.go
src/github.com/hashicorp/consul/command/agent/agent.go
src/github.com/hashicorp/consul/command/agent/agent_test.go
src/github.com/hashicorp/consul/command/catalog/catalog.go
src/github.com/hashicorp/consul/command/catalog/catalog_test.go
src/github.com/hashicorp/consul/command/catalog/list/dc/catalog_list_datacenters.go
src/github.com/hashicorp/consul/command/catalog/list/dc/catalog_list_datacenters_test.go
src/github.com/hashicorp/consul/command/catalog/list/nodes/catalog_list_nodes.go
src/github.com/hashicorp/consul/command/catalog/list/nodes/catalog_list_nodes_test.go
src/github.com/hashicorp/consul/command/catalog/list/services/catalog_list_services.go
src/github.com/hashicorp/consul/command/catalog/list/services/catalog_list_services_test.go
src/github.com/hashicorp/consul/command/config/config.go
src/github.com/hashicorp/consul/command/config/delete/config_delete.go
src/github.com/hashicorp/consul/command/config/delete/config_delete_test.go
src/github.com/hashicorp/consul/command/config/list/config_list.go
src/github.com/hashicorp/consul/command/config/list/config_list_test.go
src/github.com/hashicorp/consul/command/config/read/config_read.go
src/github.com/hashicorp/consul/command/config/read/config_read_test.go
src/github.com/hashicorp/consul/command/config/write/config_write.go
src/github.com/hashicorp/consul/command/config/write/config_write_test.go
src/github.com/hashicorp/consul/command/connect/connect.go
src/github.com/hashicorp/consul/command/connect/connect_test.go
src/github.com/hashicorp/consul/command/connect/ca/ca.go
src/github.com/hashicorp/consul/command/connect/ca/ca_test.go
src/github.com/hashicorp/consul/command/connect/ca/get/connect_ca_get.go
src/github.com/hashicorp/consul/command/connect/ca/get/connect_ca_get_test.go
src/github.com/hashicorp/consul/command/connect/ca/set/connect_ca_set.go
src/github.com/hashicorp/consul/command/connect/ca/set/connect_ca_set_test.go
src/github.com/hashicorp/consul/command/connect/envoy/bootstrap_config.go
src/github.com/hashicorp/consul/command/connect/envoy/bootstrap_config_test.go
src/github.com/hashicorp/consul/command/connect/envoy/bootstrap_tpl.go
src/github.com/hashicorp/consul/command/connect/envoy/envoy.go
src/github.com/hashicorp/consul/command/connect/envoy/envoy_test.go
src/github.com/hashicorp/consul/command/connect/envoy/exec_test.go
src/github.com/hashicorp/consul/command/connect/envoy/exec_unix.go
src/github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap/connect_envoy_pipe-bootstrap.go
src/github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap/connect_envoy_pipe-bootstrap_test.go
src/github.com/hashicorp/consul/command/connect/proxy/flag_upstreams.go
src/github.com/hashicorp/consul/command/connect/proxy/flag_upstreams_test.go
src/github.com/hashicorp/consul/command/connect/proxy/proxy.go
src/github.com/hashicorp/consul/command/connect/proxy/proxy_test.go
src/github.com/hashicorp/consul/command/connect/proxy/register.go
src/github.com/hashicorp/consul/command/connect/proxy/register_test.go
src/github.com/hashicorp/consul/command/debug/debug.go
src/github.com/hashicorp/consul/command/debug/debug_test.go
src/github.com/hashicorp/consul/command/event/event.go
src/github.com/hashicorp/consul/command/event/event_test.go
src/github.com/hashicorp/consul/command/exec/exec.go
src/github.com/hashicorp/consul/command/exec/exec_test.go
src/github.com/hashicorp/consul/command/flags/config.go
src/github.com/hashicorp/consul/command/flags/config_test.go
src/github.com/hashicorp/consul/command/flags/flag_map_value.go
src/github.com/hashicorp/consul/command/flags/flag_map_value_test.go
src/github.com/hashicorp/consul/command/flags/flag_slice_value.go
src/github.com/hashicorp/consul/command/flags/flag_slice_value_test.go
src/github.com/hashicorp/consul/command/flags/http.go
src/github.com/hashicorp/consul/command/flags/http_test.go
src/github.com/hashicorp/consul/command/flags/merge.go
src/github.com/hashicorp/consul/command/flags/usage.go
src/github.com/hashicorp/consul/command/forceleave/forceleave.go
src/github.com/hashicorp/consul/command/forceleave/forceleave_test.go
src/github.com/hashicorp/consul/command/helpers/helpers.go
src/github.com/hashicorp/consul/command/info/info.go
src/github.com/hashicorp/consul/command/info/info_test.go
src/github.com/hashicorp/consul/command/intention/intention.go
src/github.com/hashicorp/consul/command/intention/intention_test.go
src/github.com/hashicorp/consul/command/intention/check/check.go
src/github.com/hashicorp/consul/command/intention/check/check_test.go
src/github.com/hashicorp/consul/command/intention/create/create.go
src/github.com/hashicorp/consul/command/intention/create/create_test.go
src/github.com/hashicorp/consul/command/intention/delete/delete.go
src/github.com/hashicorp/consul/command/intention/delete/delete_test.go
src/github.com/hashicorp/consul/command/intention/finder/finder.go
src/github.com/hashicorp/consul/command/intention/finder/finder_test.go
src/github.com/hashicorp/consul/command/intention/get/get.go
src/github.com/hashicorp/consul/command/intention/get/get_test.go
src/github.com/hashicorp/consul/command/intention/match/match.go
src/github.com/hashicorp/consul/command/intention/match/match_test.go
src/github.com/hashicorp/consul/command/join/join.go
src/github.com/hashicorp/consul/command/join/join_test.go
src/github.com/hashicorp/consul/command/keygen/keygen.go
src/github.com/hashicorp/consul/command/keygen/keygen_test.go
src/github.com/hashicorp/consul/command/keyring/keyring.go
src/github.com/hashicorp/consul/command/keyring/keyring_test.go
src/github.com/hashicorp/consul/command/kv/kv.go
src/github.com/hashicorp/consul/command/kv/kv_test.go
src/github.com/hashicorp/consul/command/kv/del/kv_delete.go
src/github.com/hashicorp/consul/command/kv/del/kv_delete_test.go
src/github.com/hashicorp/consul/command/kv/exp/kv_export.go
src/github.com/hashicorp/consul/command/kv/exp/kv_export_test.go
src/github.com/hashicorp/consul/command/kv/get/kv_get.go
src/github.com/hashicorp/consul/command/kv/get/kv_get_test.go
src/github.com/hashicorp/consul/command/kv/imp/kv_import.go
src/github.com/hashicorp/consul/command/kv/imp/kv_import_test.go
src/github.com/hashicorp/consul/command/kv/impexp/kvimpexp.go
src/github.com/hashicorp/consul/command/kv/put/kv_put.go
src/github.com/hashicorp/consul/command/kv/put/kv_put_test.go
src/github.com/hashicorp/consul/command/leave/leave.go
src/github.com/hashicorp/consul/command/leave/leave_test.go
src/github.com/hashicorp/consul/command/lock/lock.go
src/github.com/hashicorp/consul/command/lock/lock_test.go
src/github.com/hashicorp/consul/command/lock/util_unix.go
src/github.com/hashicorp/consul/command/login/login.go
src/github.com/hashicorp/consul/command/login/login_test.go
src/github.com/hashicorp/consul/command/logout/logout.go
src/github.com/hashicorp/consul/command/logout/logout_test.go
src/github.com/hashicorp/consul/command/maint/maint.go
src/github.com/hashicorp/consul/command/maint/maint_test.go
src/github.com/hashicorp/consul/command/members/members.go
src/github.com/hashicorp/consul/command/members/members_test.go
src/github.com/hashicorp/consul/command/monitor/monitor.go
src/github.com/hashicorp/consul/command/monitor/monitor_test.go
src/github.com/hashicorp/consul/command/operator/operator.go
src/github.com/hashicorp/consul/command/operator/operator_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/operator_autopilot.go
src/github.com/hashicorp/consul/command/operator/autopilot/operator_autopilot_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/get/operator_autopilot_get.go
src/github.com/hashicorp/consul/command/operator/autopilot/get/operator_autopilot_get_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/set/operator_autopilot_set.go
src/github.com/hashicorp/consul/command/operator/autopilot/set/operator_autopilot_set_test.go
src/github.com/hashicorp/consul/command/operator/raft/operator_raft.go
src/github.com/hashicorp/consul/command/operator/raft/operator_raft_test.go
src/github.com/hashicorp/consul/command/operator/raft/listpeers/operator_raft_list.go
src/github.com/hashicorp/consul/command/operator/raft/listpeers/operator_raft_list_test.go
src/github.com/hashicorp/consul/command/operator/raft/removepeer/operator_raft_remove.go
src/github.com/hashicorp/consul/command/operator/raft/removepeer/operator_raft_remove_test.go
src/github.com/hashicorp/consul/command/reload/reload.go
src/github.com/hashicorp/consul/command/reload/reload_test.go
src/github.com/hashicorp/consul/command/rtt/rtt.go
src/github.com/hashicorp/consul/command/rtt/rtt_test.go
src/github.com/hashicorp/consul/command/services/config.go
src/github.com/hashicorp/consul/command/services/config_test.go
src/github.com/hashicorp/consul/command/services/services.go
src/github.com/hashicorp/consul/command/services/services_test.go
src/github.com/hashicorp/consul/command/services/deregister/deregister.go
src/github.com/hashicorp/consul/command/services/deregister/deregister_test.go
src/github.com/hashicorp/consul/command/services/register/register.go
src/github.com/hashicorp/consul/command/services/register/register_test.go
src/github.com/hashicorp/consul/command/snapshot/snapshot_command.go
src/github.com/hashicorp/consul/command/snapshot/snapshot_command_test.go
src/github.com/hashicorp/consul/command/snapshot/inspect/snapshot_inspect.go
src/github.com/hashicorp/consul/command/snapshot/inspect/snapshot_inspect_test.go
src/github.com/hashicorp/consul/command/snapshot/restore/snapshot_restore.go
src/github.com/hashicorp/consul/command/snapshot/restore/snapshot_restore_test.go
src/github.com/hashicorp/consul/command/snapshot/save/snapshot_save.go
src/github.com/hashicorp/consul/command/snapshot/save/snapshot_save_test.go
src/github.com/hashicorp/consul/command/tls/tls.go
src/github.com/hashicorp/consul/command/tls/tls_test.go
src/github.com/hashicorp/consul/command/tls/ca/tls_ca.go
src/github.com/hashicorp/consul/command/tls/ca/tls_ca_test.go
src/github.com/hashicorp/consul/command/tls/ca/create/tls_ca_create.go
src/github.com/hashicorp/consul/command/tls/ca/create/tls_ca_create_test.go
src/github.com/hashicorp/consul/command/tls/cert/tls_cert.go
src/github.com/hashicorp/consul/command/tls/cert/tls_cert_test.go
src/github.com/hashicorp/consul/command/tls/cert/create/tls_cert_create.go
src/github.com/hashicorp/consul/command/tls/cert/create/tls_cert_create_test.go
src/github.com/hashicorp/consul/command/validate/validate.go
src/github.com/hashicorp/consul/command/validate/validate_test.go
src/github.com/hashicorp/consul/command/version/version.go
src/github.com/hashicorp/consul/command/version/version_test.go
src/github.com/hashicorp/consul/command/watch/watch.go
src/github.com/hashicorp/consul/command/watch/watch_test.go
src/github.com/hashicorp/consul/connect/example_test.go
src/github.com/hashicorp/consul/connect/resolver.go
src/github.com/hashicorp/consul/connect/resolver_test.go
src/github.com/hashicorp/consul/connect/service.go
src/github.com/hashicorp/consul/connect/service_test.go
src/github.com/hashicorp/consul/connect/testing.go
src/github.com/hashicorp/consul/connect/tls.go
src/github.com/hashicorp/consul/connect/tls_test.go
src/github.com/hashicorp/consul/connect/certgen/certgen.go
src/github.com/hashicorp/consul/connect/proxy/config.go
src/github.com/hashicorp/consul/connect/proxy/config_test.go
src/github.com/hashicorp/consul/connect/proxy/conn.go
src/github.com/hashicorp/consul/connect/proxy/conn_test.go
src/github.com/hashicorp/consul/connect/proxy/listener.go
src/github.com/hashicorp/consul/connect/proxy/listener_test.go
src/github.com/hashicorp/consul/connect/proxy/proxy.go
src/github.com/hashicorp/consul/connect/proxy/proxy_test.go
src/github.com/hashicorp/consul/connect/proxy/testing.go
src/github.com/hashicorp/consul/ipaddr/detect.go
src/github.com/hashicorp/consul/ipaddr/detect_test.go
src/github.com/hashicorp/consul/ipaddr/ipaddr.go
src/github.com/hashicorp/consul/ipaddr/ipaddr_test.go
src/github.com/hashicorp/consul/lib/cluster.go
src/github.com/hashicorp/consul/lib/cluster_test.go
src/github.com/hashicorp/consul/lib/eof.go
src/github.com/hashicorp/consul/lib/map_walker.go
src/github.com/hashicorp/consul/lib/map_walker_test.go
src/github.com/hashicorp/consul/lib/math.go
src/github.com/hashicorp/consul/lib/path.go
src/github.com/hashicorp/consul/lib/rand.go
src/github.com/hashicorp/consul/lib/retry.go
src/github.com/hashicorp/consul/lib/retry_test.go
src/github.com/hashicorp/consul/lib/rtt.go
src/github.com/hashicorp/consul/lib/rtt_test.go
src/github.com/hashicorp/consul/lib/serf.go
src/github.com/hashicorp/consul/lib/stop_context.go
src/github.com/hashicorp/consul/lib/string.go
src/github.com/hashicorp/consul/lib/string_test.go
src/github.com/hashicorp/consul/lib/telemetry.go
src/github.com/hashicorp/consul/lib/telemetry_test.go
src/github.com/hashicorp/consul/lib/translate.go
src/github.com/hashicorp/consul/lib/translate_test.go
src/github.com/hashicorp/consul/lib/useragent.go
src/github.com/hashicorp/consul/lib/useragent_test.go
src/github.com/hashicorp/consul/lib/uuid.go
src/github.com/hashicorp/consul/lib/math_test.go
src/github.com/hashicorp/consul/lib/file/atomic.go
src/github.com/hashicorp/consul/lib/file/atomic_test.go
src/github.com/hashicorp/consul/lib/semaphore/semaphore.go
src/github.com/hashicorp/consul/lib/semaphore/semaphore_test.go
src/github.com/hashicorp/consul/logger/gated_writer.go
src/github.com/hashicorp/consul/logger/gated_writer_test.go
src/github.com/hashicorp/consul/logger/grpc.go
src/github.com/hashicorp/consul/logger/grpc_test.go
src/github.com/hashicorp/consul/logger/log_levels.go
src/github.com/hashicorp/consul/logger/log_writer.go
src/github.com/hashicorp/consul/logger/log_writer_test.go
src/github.com/hashicorp/consul/logger/logfile.go
src/github.com/hashicorp/consul/logger/logfile_test.go
src/github.com/hashicorp/consul/logger/logger.go
src/github.com/hashicorp/consul/logger/syslog.go
src/github.com/hashicorp/consul/sdk/freeport/freeport.go
src/github.com/hashicorp/consul/sdk/testutil/io.go
src/github.com/hashicorp/consul/sdk/testutil/server.go
src/github.com/hashicorp/consul/sdk/testutil/server_methods.go
src/github.com/hashicorp/consul/sdk/testutil/server_wrapper.go
src/github.com/hashicorp/consul/sdk/testutil/testlog.go
src/github.com/hashicorp/consul/sdk/testutil/retry/retry.go
src/github.com/hashicorp/consul/sdk/testutil/retry/retry_test.go
src/github.com/hashicorp/consul/sentinel/evaluator.go
src/github.com/hashicorp/consul/sentinel/scope.go
src/github.com/hashicorp/consul/sentinel/sentinel_oss.go
src/github.com/hashicorp/consul/service_os/service.go
src/github.com/hashicorp/consul/snapshot/archive.go
src/github.com/hashicorp/consul/snapshot/archive_test.go
src/github.com/hashicorp/consul/snapshot/snapshot.go
src/github.com/hashicorp/consul/snapshot/snapshot_test.go
src/github.com/hashicorp/consul/testrpc/wait.go
src/github.com/hashicorp/consul/tlsutil/config.go
src/github.com/hashicorp/consul/tlsutil/config_test.go
src/github.com/hashicorp/consul/tlsutil/generate.go
src/github.com/hashicorp/consul/tlsutil/generate_test.go
src/github.com/hashicorp/consul/types/area.go
src/github.com/hashicorp/consul/types/checks.go
src/github.com/hashicorp/consul/types/node_id.go
src/github.com/hashicorp/consul/version/version.go
	cd _build && go install -gcflags=all=\"-trimpath=/<<BUILDDIR>>/consul-1.5.2\+dfsg1/_build/src\" -asmflags=all=\"-trimpath=/<<BUILDDIR>>/consul-1.5.2\+dfsg1/_build/src\" -v -p 4 github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/checks github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/connect github.com/hashicorp/consul/agent/connect/ca github.com/hashicorp/consul/agent/connect/ca/plugin github.com/hashicorp/consul/agent/consul github.com/hashicorp/consul/agent/consul/authmethod github.com/hashicorp/consul/agent/consul/authmethod/kubeauth github.com/hashicorp/consul/agent/consul/authmethod/testauth github.com/hashicorp/consul/agent/consul/autopilot github.com/hashicorp/consul/agent/consul/fsm github.com/hashicorp/consul/agent/consul/prepared_query github.com/hashicorp/consul/agent/consul/state github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/api github.com/hashicorp/consul/api/watch github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list 
github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/role github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read 
github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get 
github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/tls github.com/hashicorp/consul/command/tls/ca github.com/hashicorp/consul/command/tls/ca/create github.com/hashicorp/consul/command/tls/cert github.com/hashicorp/consul/command/tls/cert/create github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version
internal/cpu
math/bits
runtime/internal/sys
unicode/utf8
runtime/internal/math
internal/race
internal/bytealg
runtime/internal/atomic
sync/atomic
math
unicode
internal/testlog
encoding
unicode/utf16
runtime
container/list
crypto/internal/subtle
crypto/subtle
vendor/golang.org/x/crypto/cryptobyte/asn1
internal/nettrace
runtime/cgo
vendor/golang.org/x/crypto/internal/subtle
github.com/circonus-labs/circonus-gometrics/api/config
golang.org/x/net/internal/iana
github.com/hashicorp/consul/types
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/selection
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/integer
github.com/aws/aws-sdk-go/aws/client/metadata
go.opencensus.io
go.opencensus.io/trace/internal
go.opencensus.io/internal/tagencoding
github.com/hashicorp/consul/service_os
github.com/hashicorp/consul/vendor/github.com/oklog/run
internal/reflectlite
sync
google.golang.org/grpc/internal/grpcsync
internal/singleflight
math/rand
github.com/hashicorp/consul/agent/token
golang.org/x/sync/singleflight
errors
sort
internal/oserror
io
strconv
syscall
vendor/golang.org/x/net/dns/dnsmessage
bytes
strings
reflect
bufio
github.com/armon/go-radix
time
internal/syscall/unix
hash
crypto/internal/randutil
crypto
crypto/hmac
crypto/rc4
vendor/golang.org/x/crypto/hkdf
hash/crc32
vendor/golang.org/x/text/transform
path
github.com/hashicorp/golang-lru/simplelru
github.com/hashicorp/hcl/hcl/strconv
regexp/syntax
github.com/hashicorp/go-immutable-radix
text/tabwriter
container/heap
github.com/beorn7/perks/quantile
context
internal/poll
github.com/prometheus/common/internal/bitbucket.org/ww/goautoneg
html
regexp
os
encoding/base32
hash/crc64
hash/fnv
internal/fmtsort
encoding/binary
github.com/hashicorp/errwrap
github.com/mitchellh/reflectwalk
github.com/mitchellh/copystructure
fmt
path/filepath
encoding/base64
net
crypto/cipher
crypto/sha512
crypto/ed25519/internal/edwards25519
crypto/aes
crypto/des
crypto/md5
encoding/json
math/big
crypto/sha1
crypto/sha256
encoding/hex
encoding/pem
io/ioutil
net/url
vendor/golang.org/x/crypto/internal/chacha20
vendor/golang.org/x/crypto/poly1305
vendor/golang.org/x/crypto/curve25519
vendor/golang.org/x/crypto/chacha20poly1305
compress/flate
log
vendor/golang.org/x/text/unicode/bidi
crypto/elliptic
encoding/asn1
crypto/rand
crypto/dsa
crypto/ed25519
crypto/rsa
compress/gzip
vendor/golang.org/x/text/secure/bidirule
vendor/golang.org/x/text/unicode/norm
vendor/golang.org/x/net/http2/hpack
crypto/ecdsa
crypto/x509/pkix
vendor/golang.org/x/crypto/cryptobyte
mime
mime/quotedprintable
net/http/internal
os/signal
vendor/golang.org/x/net/idna
github.com/hashicorp/hcl/hcl/token
golang.org/x/crypto/blake2b
github.com/hashicorp/hcl/hcl/ast
github.com/hashicorp/hcl/hcl/scanner
github.com/hashicorp/hcl/json/token
github.com/hashicorp/hcl/json/scanner
github.com/hashicorp/hcl/hcl/parser
github.com/hashicorp/hcl/json/parser
github.com/pkg/errors
crypto/x509
net/textproto
vendor/golang.org/x/net/http/httpproxy
github.com/mitchellh/mapstructure
github.com/hashicorp/hcl
vendor/golang.org/x/net/http/httpguts
mime/multipart
github.com/hashicorp/hcl/hcl/printer
github.com/circonus-labs/circonusllhist
github.com/DataDog/datadog-go/statsd
github.com/cespare/xxhash
github.com/golang/protobuf/proto
github.com/prometheus/common/model
github.com/prometheus/procfs/internal/fs
github.com/prometheus/procfs
crypto/tls
runtime/debug
github.com/hashicorp/consul/version
github.com/hashicorp/go-uuid
encoding/gob
go/token
text/template/parse
text/template
compress/lzw
github.com/google/btree
github.com/hashicorp/go-multierror
os/exec
github.com/hashicorp/go-sockaddr
html/template
github.com/prometheus/client_model/go
github.com/prometheus/client_golang/prometheus/internal
github.com/matttproud/golang_protobuf_extensions/pbutil
net/http/httptrace
github.com/hashicorp/go-rootcerts
golang.org/x/crypto/ed25519
golang.org/x/net/bpf
net/http
golang.org/x/sys/unix
text/scanner
github.com/hashicorp/memberlist/vendor/github.com/sean-/seed
github.com/hashicorp/yamux
github.com/mitchellh/go-testing-interface
github.com/davecgh/go-spew/spew
github.com/pmezard/go-difflib/difflib
github.com/stretchr/objx
gopkg.in/yaml.v2
golang.org/x/net/internal/socket
flag
golang.org/x/net/ipv4
golang.org/x/net/ipv6
github.com/hashicorp/go-version
github.com/miekg/dns
github.com/mattn/go-isatty
github.com/mattn/go-colorable
github.com/fatih/color
runtime/trace
testing
github.com/hashicorp/go-hclog
github.com/hashicorp/golang-lru
github.com/mitchellh/hashstructure
github.com/kr/text
github.com/bgentry/speakeasy
os/user
github.com/posener/complete/match
github.com/hashicorp/consul/command/helpers
github.com/armon/circbuf
golang.org/x/net/internal/socks
golang.org/x/net/proxy
github.com/hashicorp/consul/agent/exec
golang.org/x/net/internal/timeseries
google.golang.org/grpc/grpclog
google.golang.org/grpc/connectivity
google.golang.org/grpc/credentials/internal
google.golang.org/grpc/credentials
google.golang.org/grpc/internal
google.golang.org/grpc/metadata
google.golang.org/grpc/serviceconfig
google.golang.org/grpc/resolver
google.golang.org/grpc/balancer
google.golang.org/grpc/balancer/base
github.com/hashicorp/go-cleanhttp
github.com/armon/go-metrics
github.com/hashicorp/go-retryablehttp
github.com/circonus-labs/circonus-gometrics/api
github.com/tv42/httpunix
expvar
github.com/hashicorp/serf/coordinate
github.com/armon/go-metrics/datadog
github.com/prometheus/common/expfmt
github.com/hashicorp/consul/api
github.com/prometheus/client_golang/prometheus
github.com/circonus-labs/circonus-gometrics/checkmgr
github.com/circonus-labs/circonus-gometrics
github.com/armon/go-metrics/circonus
net/rpc
github.com/hashicorp/consul/sentinel
github.com/hashicorp/consul/acl
net/http/httptest
github.com/prometheus/client_golang/prometheus/push
github.com/hashicorp/go-msgpack/codec
github.com/armon/go-metrics/prometheus
github.com/stretchr/testify/assert
github.com/hashicorp/consul/command/flags
github.com/posener/complete/cmd/install
github.com/posener/complete/cmd
github.com/posener/complete
github.com/mitchellh/cli
github.com/stretchr/testify/mock
github.com/stretchr/testify/require
github.com/hashicorp/consul/command/acl/agenttokens
github.com/hashicorp/consul/command/acl/authmethod
github.com/hashicorp/consul/command/acl/authmethod/delete
github.com/hashicorp/memberlist
github.com/hashicorp/raft
github.com/hashicorp/consul/command/acl/bindingrule
github.com/hashicorp/consul/command/acl/policy
github.com/hashicorp/consul/command/acl/role
github.com/hashicorp/consul/command/acl/token
github.com/NYTimes/gziphandler
github.com/hashicorp/consul/vendor/github.com/coredns/coredns/plugin/pkg/dnsutil
github.com/elazarl/go-bindata-assetfs
github.com/docker/go-connections/sockets
golang.org/x/net/trace
google.golang.org/grpc/internal/grpcrand
google.golang.org/grpc/balancer/roundrobin
google.golang.org/grpc/codes
google.golang.org/grpc/encoding
google.golang.org/grpc/encoding/proto
google.golang.org/grpc/internal/backoff
google.golang.org/grpc/internal/balancerload
github.com/golang/protobuf/ptypes/any
github.com/golang/protobuf/ptypes/duration
github.com/golang/protobuf/ptypes/timestamp
github.com/golang/protobuf/ptypes
google.golang.org/grpc/binarylog/grpc_binarylog_v1
google.golang.org/genproto/googleapis/rpc/status
google.golang.org/grpc/status
github.com/hashicorp/serf/serf
google.golang.org/grpc/internal/channelz
google.golang.org/grpc/internal/binarylog
google.golang.org/grpc/internal/envconfig
golang.org/x/text/transform
golang.org/x/text/unicode/bidi
golang.org/x/text/unicode/norm
golang.org/x/net/http2/hpack
golang.org/x/text/secure/bidirule
google.golang.org/grpc/internal/syscall
github.com/hashicorp/consul/lib
github.com/hashicorp/consul/agent/consul/autopilot
google.golang.org/grpc/keepalive
google.golang.org/grpc/peer
google.golang.org/grpc/stats
google.golang.org/grpc/tap
google.golang.org/grpc/naming
github.com/hashicorp/consul/agent/cache
github.com/hashicorp/consul/agent/ae
google.golang.org/grpc/resolver/dns
google.golang.org/grpc/resolver/passthrough
golang.org/x/net/idna
net/http/httputil
golang.org/x/net/context
github.com/hashicorp/hil/ast
github.com/hashicorp/consul/agent/structs
github.com/hashicorp/hil
github.com/hashicorp/go-memdb
golang.org/x/net/http/httpguts
golang.org/x/net/http2
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/hclutil
github.com/golang/snappy
github.com/ryanuber/go-glob
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/strutil
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/parseutil
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/compressutil
golang.org/x/time/rate
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/jsonutil
golang.org/x/crypto/pbkdf2
gopkg.in/square/go-jose.v2/cipher
gopkg.in/square/go-jose.v2/json
github.com/gogo/protobuf/proto
github.com/hashicorp/consul/command/acl
github.com/hashicorp/consul/command/acl/authmethod/create
github.com/hashicorp/consul/command/acl/authmethod/list
github.com/hashicorp/consul/command/acl/authmethod/read
github.com/hashicorp/consul/command/acl/authmethod/update
github.com/hashicorp/consul/command/acl/bindingrule/create
github.com/hashicorp/consul/command/acl/bindingrule/delete
github.com/hashicorp/consul/command/acl/bindingrule/list
github.com/hashicorp/consul/command/acl/bindingrule/read
github.com/hashicorp/consul/command/acl/bindingrule/update
github.com/hashicorp/consul/command/acl/bootstrap
github.com/hashicorp/consul/command/acl/policy/create
github.com/hashicorp/consul/command/acl/policy/delete
github.com/hashicorp/consul/command/acl/policy/list
github.com/hashicorp/consul/command/acl/policy/read
github.com/hashicorp/consul/command/acl/policy/update
github.com/hashicorp/consul/command/acl/role/create
github.com/hashicorp/consul/command/acl/role/delete
github.com/hashicorp/consul/command/acl/role/list
github.com/hashicorp/consul/command/acl/role/read
github.com/hashicorp/consul/command/acl/role/update
github.com/hashicorp/consul/command/acl/rules
github.com/hashicorp/consul/command/acl/token/clone
github.com/hashicorp/consul/command/acl/token/create
github.com/hashicorp/consul/command/acl/token/delete
github.com/hashicorp/consul/command/acl/token/list
github.com/hashicorp/consul/command/acl/token/read
github.com/hashicorp/consul/command/acl/token/update
github.com/hashicorp/consul/agent/connect
google.golang.org/grpc/internal/transport
github.com/hashicorp/consul/agent/consul/prepared_query
github.com/hashicorp/consul/agent/consul/state
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/api
github.com/hashicorp/consul/agent/consul/authmethod
gopkg.in/square/go-jose.v2
google.golang.org/grpc
gopkg.in/square/go-jose.v2/jwt
github.com/hashicorp/consul/agent/connect/ca
github.com/gogo/protobuf/sortkeys
github.com/google/gofuzz
gopkg.in/inf.v0
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/third_party/forked/golang/reflect
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/api/resource
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/fields
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/conversion
github.com/golang/glog
google.golang.org/grpc/health/grpc_health_v1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/sets
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/errors
internal/lazyregexp
go/scanner
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/conversion/queryparams
github.com/hashicorp/consul/agent/checks
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/schema
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/validation/field
go/ast
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/json
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/runtime
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/types
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/intstr
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/validation
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/net
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/wait
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/labels
github.com/googleapis/gnostic/extensions
github.com/googleapis/gnostic/compiler
github.com/gregjones/httpcache
hash/adler32
compress/zlib
github.com/googleapis/gnostic/OpenAPIv2
github.com/ghodss/yaml
github.com/peterbourgon/diskv
go/doc
go/parser
github.com/gregjones/httpcache/diskcache
github.com/modern-go/concurrent
github.com/modern-go/reflect2
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/framer
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/yaml
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/version
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/version
golang.org/x/crypto/ssh/terminal
github.com/hashicorp/consul/vendor/k8s.io/client-go/transport
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/connrotation
github.com/hashicorp/consul/vendor/k8s.io/client-go/tools/metrics
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/cert
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/clock
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/flowcontrol
github.com/hashicorp/consul/agent/consul/fsm
github.com/json-iterator/go
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/watch
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/recognizer
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/apis/meta/v1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/protobuf
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/streaming
github.com/hashicorp/consul/vendor/k8s.io/client-go/tools/clientcmd/api
github.com/hashicorp/consul/agent/metadata
github.com/hashicorp/consul/tlsutil
github.com/hashicorp/net-rpc-msgpackrpc
github.com/hashicorp/consul/agent/pool
github.com/hashicorp/consul/agent/router
github.com/hashicorp/consul/ipaddr
github.com/hashicorp/consul/lib/semaphore
archive/tar
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json
github.com/hashicorp/consul/snapshot
github.com/hashicorp/consul/vendor/github.com/hashicorp/go-bexpr
github.com/boltdb/bolt
github.com/hashicorp/go-sockaddr/template
github.com/shirou/gopsutil/internal/common
github.com/hashicorp/consul/vendor/k8s.io/api/authentication/v1
github.com/hashicorp/consul/vendor/k8s.io/api/core/v1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/api/errors
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/apis/meta/v1/unstructured
github.com/hashicorp/consul/vendor/k8s.io/api/admissionregistration/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/admissionregistration/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning
github.com/hashicorp/consul/vendor/k8s.io/api/authentication/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer
github.com/hashicorp/consul/vendor/k8s.io/api/authorization/v1
github.com/hashicorp/consul/vendor/k8s.io/api/authorization/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/certificates/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/rbac/v1
github.com/hashicorp/consul/vendor/k8s.io/api/rbac/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/rbac/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/scheduling/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/scheduling/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/storage/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/apis/clientauthentication
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/apis/clientauthentication/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/apis/clientauthentication/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/rest/watch
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/apis/meta/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/plugin/pkg/client/auth/exec
github.com/hashicorp/raft-boltdb
github.com/hashicorp/consul/vendor/k8s.io/client-go/rest
github.com/shirou/gopsutil/cpu
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/api/meta
github.com/shirou/gopsutil/disk
github.com/shirou/gopsutil/host
github.com/shirou/gopsutil/mem
github.com/hashicorp/consul/agent/local
github.com/hashicorp/consul/agent/debug
github.com/hashicorp/consul/lib/file
github.com/hashicorp/consul/agent/systemd
github.com/golang/protobuf/protoc-gen-go/descriptor
github.com/gogo/protobuf/protoc-gen-gogo/descriptor
github.com/hashicorp/consul/agent/proxyprocess
github.com/gogo/protobuf/types
github.com/hashicorp/consul/vendor/github.com/envoyproxy/protoc-gen-validate/validate
github.com/gogo/protobuf/gogoproto
net/mail
github.com/gogo/googleapis/google/api
github.com/hashicorp/consul/api/watch
log/syslog
github.com/hashicorp/logutils
github.com/hashicorp/consul/sdk/freeport
github.com/hashicorp/go-syslog
github.com/hashicorp/consul/sdk/testutil/retry
encoding/xml
github.com/hashicorp/consul/logger
github.com/denverdino/aliyungo/util
github.com/aws/aws-sdk-go/aws/awserr
github.com/aws/aws-sdk-go/internal/ini
github.com/denverdino/aliyungo/common
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/type
github.com/gogo/googleapis/google/rpc
github.com/gogo/protobuf/jsonpb
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/core
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/pkg/util
github.com/denverdino/aliyungo/ecs
github.com/aws/aws-sdk-go/internal/shareddefaults
github.com/aws/aws-sdk-go/aws/credentials
github.com/aws/aws-sdk-go/aws/endpoints
github.com/hashicorp/go-discover/provider/aliyun
github.com/aws/aws-sdk-go/internal/sdkio
github.com/jmespath/go-jmespath
github.com/aws/aws-sdk-go/aws/awsutil
github.com/aws/aws-sdk-go/internal/sdkrand
github.com/aws/aws-sdk-go/internal/sdkuri
github.com/aws/aws-sdk-go/aws/credentials/processcreds
golang.org/x/net/context/ctxhttp
golang.org/x/oauth2/internal
golang.org/x/oauth2
cloud.google.com/go/compute/metadata
github.com/aws/aws-sdk-go/aws
golang.org/x/oauth2/jws
golang.org/x/oauth2/jwt
github.com/aws/aws-sdk-go/aws/request
golang.org/x/oauth2/google
google.golang.org/api/googleapi/internal/uritemplates
google.golang.org/api/googleapi
github.com/aws/aws-sdk-go/aws/corehandlers
github.com/aws/aws-sdk-go/aws/client
github.com/aws/aws-sdk-go/private/protocol
github.com/aws/aws-sdk-go/aws/ec2metadata
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/auth
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/cluster
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/endpoint
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/route
github.com/hashicorp/consul/vendor/k8s.io/api/apps/v1
github.com/hashicorp/consul/vendor/k8s.io/api/apps/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/apps/v1beta2
github.com/hashicorp/consul/vendor/k8s.io/api/autoscaling/v1
github.com/hashicorp/consul/vendor/k8s.io/api/autoscaling/v2beta1
github.com/hashicorp/consul/vendor/k8s.io/api/batch/v1
github.com/hashicorp/consul/vendor/k8s.io/api/events/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/extensions/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/batch/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/batch/v2alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/networking/v1
github.com/hashicorp/consul/vendor/k8s.io/api/policy/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/settings/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/storage/v1
github.com/hashicorp/consul/vendor/k8s.io/api/storage/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/tools/reference
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/listener
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/ext_authz/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/accesslog/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/auth/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/auth/v2alpha
github.com/aws/aws-sdk-go/aws/credentials/ec2rolecreds
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/tcp_proxy/v2
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/scheme
github.com/aws/aws-sdk-go/private/protocol/json/jsonutil
github.com/hashicorp/consul/vendor/k8s.io/client-go/discovery
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/admissionregistration/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/admissionregistration/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/apps/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/apps/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/apps/v1beta2
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authentication/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authentication/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authorization/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authorization/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/autoscaling/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/autoscaling/v2beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/batch/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/batch/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/batch/v2alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/certificates/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/core/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/events/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/extensions/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/networking/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/policy/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/rbac/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/rbac/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/rbac/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/scheduling/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/scheduling/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/settings/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/storage/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/storage/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/storage/v1beta1
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/http_connection_manager/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/discovery/v2
github.com/aws/aws-sdk-go/aws/credentials/endpointcreds
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes
github.com/aws/aws-sdk-go/aws/defaults
github.com/aws/aws-sdk-go/private/protocol/rest
github.com/aws/aws-sdk-go/aws/signer/v4
github.com/aws/aws-sdk-go/private/protocol/query/queryutil
github.com/hashicorp/consul/agent/consul/authmethod/kubeauth
github.com/aws/aws-sdk-go/private/protocol/xml/xmlutil
github.com/aws/aws-sdk-go/aws/csm
google.golang.org/api/gensupport
github.com/aws/aws-sdk-go/private/protocol/query
github.com/aws/aws-sdk-go/private/protocol/ec2query
github.com/hashicorp/consul/agent/consul
github.com/aws/aws-sdk-go/service/sts
github.com/aws/aws-sdk-go/service/ec2
google.golang.org/api/internal
google.golang.org/api/option
go.opencensus.io/internal
go.opencensus.io/trace/tracestate
go.opencensus.io/trace
github.com/aws/aws-sdk-go/service/sts/stsiface
github.com/aws/aws-sdk-go/aws/credentials/stscreds
github.com/aws/aws-sdk-go/aws/session
go.opencensus.io/trace/propagation
go.opencensus.io/plugin/ochttp/propagation/b3
go.opencensus.io/resource
go.opencensus.io/metric/metricdata
runtime/pprof
go.opencensus.io/metric/metricproducer
google.golang.org/api/googleapi/transport
google.golang.org/api/transport/http/internal/propagation
github.com/hashicorp/mdns
github.com/hashicorp/go-discover/provider/mdns
go.opencensus.io/tag
github.com/gophercloud/gophercloud
go.opencensus.io/stats/internal
go.opencensus.io/stats
go.opencensus.io/stats/view
go.opencensus.io/plugin/ochttp
github.com/gophercloud/gophercloud/pagination
github.com/gophercloud/gophercloud/openstack/identity/v2/tenants
github.com/gophercloud/gophercloud/openstack/identity/v2/tokens
google.golang.org/api/transport/http
github.com/gophercloud/gophercloud/openstack/identity/v3/tokens
github.com/gophercloud/gophercloud/openstack/utils
google.golang.org/api/compute/v1
github.com/gophercloud/gophercloud/openstack
github.com/gophercloud/gophercloud/openstack/compute/v2/flavors
github.com/hashicorp/consul/agent/cache-types
github.com/hashicorp/consul/agent/config
github.com/hashicorp/consul/agent/proxycfg
github.com/hashicorp/consul/agent/xds
github.com/gophercloud/gophercloud/openstack/compute/v2/images
github.com/gophercloud/gophercloud/openstack/compute/v2/servers
github.com/packethost/packngo
github.com/hashicorp/go-discover/provider/os
github.com/imdario/mergo
github.com/prometheus/client_golang/prometheus/promhttp
github.com/hashicorp/go-discover/provider/packet
net/http/pprof
github.com/hashicorp/go-checkpoint
github.com/hashicorp/consul/command/catalog
github.com/hashicorp/consul/command/catalog/list/dc
github.com/ryanuber/columnize
github.com/hashicorp/consul/command/catalog/list/nodes
github.com/hashicorp/consul/command/catalog/list/services
github.com/hashicorp/consul/command/config
github.com/hashicorp/consul/command/config/delete
github.com/hashicorp/consul/command/config/list
github.com/hashicorp/consul/command/config/read
github.com/hashicorp/consul/command/config/write
github.com/hashicorp/consul/command/connect
github.com/hashicorp/consul/command/connect/ca
github.com/hashicorp/consul/command/connect/ca/get
github.com/hashicorp/consul/command/connect/ca/set
github.com/hashicorp/consul/connect
github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap
github.com/hashicorp/consul/command/debug
github.com/hashicorp/consul/command/event
github.com/hashicorp/consul/connect/proxy
github.com/hashicorp/consul/command/exec
github.com/hashicorp/consul/command/forceleave
github.com/hashicorp/consul/command/connect/proxy
github.com/hashicorp/consul/command/info
github.com/hashicorp/consul/command/intention
github.com/hashicorp/consul/command/intention/check
github.com/hashicorp/consul/command/connect/envoy
github.com/hashicorp/consul/command/intention/finder
github.com/hashicorp/consul/command/intention/create
github.com/hashicorp/consul/command/intention/delete
github.com/hashicorp/consul/command/intention/get
github.com/hashicorp/consul/command/intention/match
github.com/hashicorp/consul/command/join
github.com/hashicorp/consul/command/keygen
github.com/hashicorp/consul/command/kv
github.com/hashicorp/consul/command/kv/del
github.com/hashicorp/consul/command/kv/impexp
github.com/hashicorp/consul/command/kv/exp
github.com/hashicorp/consul/command/kv/get
github.com/hashicorp/consul/command/kv/imp
github.com/hashicorp/consul/command/kv/put
github.com/hashicorp/consul/command/leave
github.com/hashicorp/consul/command/login
github.com/hashicorp/consul/command/logout
github.com/hashicorp/consul/command/maint
github.com/hashicorp/consul/command/members
github.com/hashicorp/consul/command/monitor
github.com/hashicorp/consul/command/operator
github.com/hashicorp/consul/command/operator/autopilot
github.com/hashicorp/consul/command/operator/autopilot/get
github.com/hashicorp/consul/command/operator/autopilot/set
github.com/hashicorp/consul/command/operator/raft
github.com/hashicorp/consul/command/operator/raft/listpeers
github.com/hashicorp/consul/command/operator/raft/removepeer
github.com/hashicorp/consul/command/reload
github.com/hashicorp/consul/command/rtt
github.com/hashicorp/consul/command/services
github.com/hashicorp/consul/command/services/deregister
github.com/hashicorp/consul/command/services/register
github.com/hashicorp/consul/command/snapshot
github.com/hashicorp/consul/command/snapshot/inspect
github.com/hashicorp/consul/command/snapshot/restore
github.com/hashicorp/consul/command/snapshot/save
github.com/hashicorp/consul/command/tls
github.com/hashicorp/consul/command/tls/ca
github.com/hashicorp/consul/command/tls/ca/create
github.com/hashicorp/consul/command/tls/cert
github.com/hashicorp/consul/command/tls/cert/create
github.com/hashicorp/consul/command/validate
github.com/hashicorp/consul/command/version
google.golang.org/grpc/health
github.com/hashicorp/consul/agent/consul/authmethod/testauth
github.com/hashicorp/consul/agent/mock
github.com/hashicorp/consul/vendor/github.com/hashicorp/go-plugin
github.com/hashicorp/consul/connect/certgen
github.com/hashicorp/consul/agent/connect/ca/plugin
github.com/hashicorp/consul/sdk/testutil
github.com/hashicorp/consul/testrpc
github.com/hashicorp/go-discover/provider/aws
github.com/hashicorp/go-discover/provider/gce
github.com/hashicorp/go-discover
github.com/hashicorp/consul/agent
github.com/hashicorp/consul/command/watch
github.com/hashicorp/consul/command/agent
github.com/hashicorp/consul/command/lock
github.com/hashicorp/consul/command/keyring
github.com/hashicorp/consul/command
github.com/hashicorp/consul
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
   debian/rules override_dh_auto_test
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
PATH="/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/bin:${PATH}" \
        DH_GOLANG_EXCLUDES="test/integration api agent/cache agent/checks agent/connect agent/consul command/tls" \
        dh_auto_test -v --max-parallel=4 -- -short -failfast -timeout 7m
	cd _build && go test -vet=off -v -p 4 -short -failfast -timeout 7m github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/role 
github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create 
github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen 
github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version
testing: warning: no tests to run
PASS
ok  	github.com/hashicorp/consul	0.350s [no tests to run]
=== RUN   TestACL
=== RUN   TestACL/DenyAll
=== RUN   TestACL/DenyAll/DenyACLRead
=== RUN   TestACL/DenyAll/DenyACLWrite
=== RUN   TestACL/DenyAll/DenyAgentRead
=== RUN   TestACL/DenyAll/DenyAgentWrite
=== RUN   TestACL/DenyAll/DenyEventRead
=== RUN   TestACL/DenyAll/DenyEventWrite
=== RUN   TestACL/DenyAll/DenyIntentionDefaultAllow
=== RUN   TestACL/DenyAll/DenyIntentionRead
=== RUN   TestACL/DenyAll/DenyIntentionWrite
=== RUN   TestACL/DenyAll/DenyKeyRead
=== RUN   TestACL/DenyAll/DenyKeyringRead
=== RUN   TestACL/DenyAll/DenyKeyringWrite
=== RUN   TestACL/DenyAll/DenyKeyWrite
=== RUN   TestACL/DenyAll/DenyNodeRead
=== RUN   TestACL/DenyAll/DenyNodeWrite
=== RUN   TestACL/DenyAll/DenyOperatorRead
=== RUN   TestACL/DenyAll/DenyOperatorWrite
=== RUN   TestACL/DenyAll/DenyPreparedQueryRead
=== RUN   TestACL/DenyAll/DenyPreparedQueryWrite
=== RUN   TestACL/DenyAll/DenyServiceRead
=== RUN   TestACL/DenyAll/DenyServiceWrite
=== RUN   TestACL/DenyAll/DenySessionRead
=== RUN   TestACL/DenyAll/DenySessionWrite
=== RUN   TestACL/DenyAll/DenySnapshot
=== RUN   TestACL/AllowAll
=== RUN   TestACL/AllowAll/DenyACLRead
=== RUN   TestACL/AllowAll/DenyACLWrite
=== RUN   TestACL/AllowAll/AllowAgentRead
=== RUN   TestACL/AllowAll/AllowAgentWrite
=== RUN   TestACL/AllowAll/AllowEventRead
=== RUN   TestACL/AllowAll/AllowEventWrite
=== RUN   TestACL/AllowAll/AllowIntentionDefaultAllow
=== RUN   TestACL/AllowAll/AllowIntentionRead
=== RUN   TestACL/AllowAll/AllowIntentionWrite
=== RUN   TestACL/AllowAll/AllowKeyRead
=== RUN   TestACL/AllowAll/AllowKeyringRead
=== RUN   TestACL/AllowAll/AllowKeyringWrite
=== RUN   TestACL/AllowAll/AllowKeyWrite
=== RUN   TestACL/AllowAll/AllowNodeRead
=== RUN   TestACL/AllowAll/AllowNodeWrite
=== RUN   TestACL/AllowAll/AllowOperatorRead
=== RUN   TestACL/AllowAll/AllowOperatorWrite
=== RUN   TestACL/AllowAll/AllowPreparedQueryRead
=== RUN   TestACL/AllowAll/AllowPreparedQueryWrite
=== RUN   TestACL/AllowAll/AllowServiceRead
=== RUN   TestACL/AllowAll/AllowServiceWrite
=== RUN   TestACL/AllowAll/AllowSessionRead
=== RUN   TestACL/AllowAll/AllowSessionWrite
=== RUN   TestACL/AllowAll/DenySnapshot
=== RUN   TestACL/ManageAll
=== RUN   TestACL/ManageAll/AllowACLRead
=== RUN   TestACL/ManageAll/AllowACLWrite
=== RUN   TestACL/ManageAll/AllowAgentRead
=== RUN   TestACL/ManageAll/AllowAgentWrite
=== RUN   TestACL/ManageAll/AllowEventRead
=== RUN   TestACL/ManageAll/AllowEventWrite
=== RUN   TestACL/ManageAll/AllowIntentionDefaultAllow
=== RUN   TestACL/ManageAll/AllowIntentionRead
=== RUN   TestACL/ManageAll/AllowIntentionWrite
=== RUN   TestACL/ManageAll/AllowKeyRead
=== RUN   TestACL/ManageAll/AllowKeyringRead
=== RUN   TestACL/ManageAll/AllowKeyringWrite
=== RUN   TestACL/ManageAll/AllowKeyWrite
=== RUN   TestACL/ManageAll/AllowNodeRead
=== RUN   TestACL/ManageAll/AllowNodeWrite
=== RUN   TestACL/ManageAll/AllowOperatorRead
=== RUN   TestACL/ManageAll/AllowOperatorWrite
=== RUN   TestACL/ManageAll/AllowPreparedQueryRead
=== RUN   TestACL/ManageAll/AllowPreparedQueryWrite
=== RUN   TestACL/ManageAll/AllowServiceRead
=== RUN   TestACL/ManageAll/AllowServiceWrite
=== RUN   TestACL/ManageAll/AllowSessionRead
=== RUN   TestACL/ManageAll/AllowSessionWrite
=== RUN   TestACL/ManageAll/AllowSnapshot
=== RUN   TestACL/AgentBasicDefaultDeny
=== RUN   TestACL/AgentBasicDefaultDeny/DefaultReadDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultDeny/DefaultWriteDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultDeny/ROReadAllowed.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultDeny/ROWriteDenied.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultDeny/RWWriteDenied.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultAllow
=== RUN   TestACL/AgentBasicDefaultAllow/DefaultReadDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultAllow/DefaultWriteDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultAllow/ROReadAllowed.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultAllow/ROWriteDenied.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultAllow/RWWriteDenied.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-sub)
=== RUN   TestACL/PreparedQueryDefaultAllow
=== RUN   TestACL/PreparedQueryDefaultAllow/ReadAllowed.Prefix(foo)
=== RUN   TestACL/PreparedQueryDefaultAllow/WriteAllowed.Prefix(foo)
=== RUN   TestACL/PreparedQueryDefaultAllow/ReadDenied.Prefix(other)
=== RUN   TestACL/PreparedQueryDefaultAllow/WriteDenied.Prefix(other)
=== RUN   TestACL/AgentNestedDefaultDeny
=== RUN   TestACL/AgentNestedDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultAllow
=== RUN   TestACL/AgentNestedDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny/ReadDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny/WriteDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyRead
=== RUN   TestACL/KeyringDefaultAllowPolicyRead/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyRead/WriteDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite/WriteAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyNone
=== RUN   TestACL/KeyringDefaultAllowPolicyNone/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyNone/WriteAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny/ReadDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny/WriteDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyRead
=== RUN   TestACL/KeyringDefaultDenyPolicyRead/ReadAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyRead/WriteDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite/ReadAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite/WriteAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyNone
=== RUN   TestACL/KeyringDefaultDenyPolicyNone/ReadDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyNone/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny/ReadDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyRead
=== RUN   TestACL/OperatorDefaultAllowPolicyRead/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyRead/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite/WriteAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyNone
=== RUN   TestACL/OperatorDefaultAllowPolicyNone/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyNone/WriteAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny/ReadDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny/WriteDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyRead
=== RUN   TestACL/OperatorDefaultDenyPolicyRead/ReadAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyRead/WriteDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite/ReadAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite/WriteAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyNone
=== RUN   TestACL/OperatorDefaultDenyPolicyNone/ReadDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyNone/WriteDenied
=== RUN   TestACL/NodeDefaultDeny
=== RUN   TestACL/NodeDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/NodeDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/NodeDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultAllow
=== RUN   TestACL/NodeDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/NodeDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/NodeDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultDeny
=== RUN   TestACL/SessionDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/SessionDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/SessionDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultAllow
=== RUN   TestACL/SessionDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/SessionDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/SessionDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/Parent
=== RUN   TestACL/Parent/KeyReadDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyWriteAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyWritePrefixAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyReadDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(zip/test)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(zip/test)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(zip/test)
=== RUN   TestACL/Parent/ServiceReadDenied.Prefix(fail)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(fail)
=== RUN   TestACL/Parent/ServiceReadAllowed.Prefix(other)
=== RUN   TestACL/Parent/ServiceWriteAllowed.Prefix(other)
=== RUN   TestACL/Parent/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(foo)
=== RUN   TestACL/Parent/ServiceReadDenied.Prefix(bar)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(foo)
=== RUN   TestACL/Parent/PreparedQueryReadAllowed.Prefix(foobar)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(foobar)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(barbaz)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(barbaz)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(baz)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(baz)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(nope)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(nope)
=== RUN   TestACL/Parent/ACLReadDenied
=== RUN   TestACL/Parent/ACLWriteDenied
=== RUN   TestACL/Parent/SnapshotDenied
=== RUN   TestACL/Parent/IntentionDefaultAllowDenied
=== RUN   TestACL/ComplexDefaultAllow
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(intbaz)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(intbaz)
=== RUN   TestACL/ComplexDefaultAllow/IntentionDefaultAllowAllowed
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(nope)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(nope)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zoo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zoo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zookeeper)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zookeeper)
=== RUN   TestACL/ExactMatchPrecedence
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/AgentWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/AgentWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/KeyWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/KeyWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)#01
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/SessionWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/SessionWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/EventReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/EventWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/EventReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/EventWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWriteDenied.Prefix(football)
=== RUN   TestACL/ACLRead
=== RUN   TestACL/ACLRead/ReadAllowed
=== RUN   TestACL/ACLRead/WriteDenied
=== RUN   TestACL/ACLRead#01
=== RUN   TestACL/ACLRead#01/ReadAllowed
=== RUN   TestACL/ACLRead#01/WriteAllowed
=== RUN   TestACL/KeyWritePrefixDefaultDeny
=== RUN   TestACL/KeyWritePrefixDefaultDeny/DeniedTopLevelPrefix.Prefix(foo)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/AllowedTopLevelPrefix.Prefix(baz/)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/AllowedPrefixWithNestedWrite.Prefix(foo/)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/DenyPrefixWithNestedRead.Prefix(bar/)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/DenyNoPrefixMatch.Prefix(te)
=== RUN   TestACL/KeyWritePrefixDefaultAllow
=== RUN   TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixDenied.Prefix(foo)
=== RUN   TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixAllowed.Prefix(bar)
--- PASS: TestACL (0.39s)
    --- PASS: TestACL/DenyAll (0.02s)
        --- PASS: TestACL/DenyAll/DenyACLRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyACLWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyAgentRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyAgentWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyEventRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyEventWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyringRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyringWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyNodeRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyNodeWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyOperatorRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyOperatorWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyPreparedQueryRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyPreparedQueryWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyServiceRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyServiceWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenySessionRead (0.00s)
        --- PASS: TestACL/DenyAll/DenySessionWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenySnapshot (0.00s)
    --- PASS: TestACL/AllowAll (0.02s)
        --- PASS: TestACL/AllowAll/DenyACLRead (0.00s)
        --- PASS: TestACL/AllowAll/DenyACLWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowAgentRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowAgentWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowEventRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowEventWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyringRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyringWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowNodeRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowNodeWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowOperatorRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowOperatorWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowPreparedQueryRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowPreparedQueryWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowServiceRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowServiceWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowSessionRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowSessionWrite (0.00s)
        --- PASS: TestACL/AllowAll/DenySnapshot (0.00s)
    --- PASS: TestACL/ManageAll (0.02s)
        --- PASS: TestACL/ManageAll/AllowACLRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowACLWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowAgentRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowAgentWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowEventRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowEventWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyringRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyringWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowNodeRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowNodeWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowOperatorRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowOperatorWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowPreparedQueryRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowPreparedQueryWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowServiceRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowServiceWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowSessionRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowSessionWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowSnapshot (0.00s)
    --- PASS: TestACL/AgentBasicDefaultDeny (0.01s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DefaultReadDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DefaultWriteDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROReadAllowed.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROWriteDenied.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWWriteDenied.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-sub) (0.00s)
    --- PASS: TestACL/AgentBasicDefaultAllow (0.01s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DefaultReadDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DefaultWriteDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROReadAllowed.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROWriteDenied.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWWriteDenied.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-sub) (0.00s)
    --- PASS: TestACL/PreparedQueryDefaultAllow (0.01s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/ReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/WriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/ReadDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/WriteDenied.Prefix(other) (0.00s)
    --- PASS: TestACL/AgentNestedDefaultDeny (0.02s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/AgentNestedDefaultAllow (0.03s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyDeny (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyRead (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyWrite (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyNone (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyNone/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyNone/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyDeny (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyRead (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyWrite (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyNone (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyNone/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyNone/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyDeny (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyRead (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyWrite (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyNone (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyNone/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyNone/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyDeny (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyRead (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyWrite (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyNone (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyNone/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyNone/WriteDenied (0.00s)
    --- PASS: TestACL/NodeDefaultDeny (0.02s)
        --- PASS: TestACL/NodeDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/NodeDefaultAllow (0.02s)
        --- PASS: TestACL/NodeDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/SessionDefaultDeny (0.02s)
        --- PASS: TestACL/SessionDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/SessionDefaultAllow (0.01s)
        --- PASS: TestACL/SessionDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/Parent (0.02s)
        --- PASS: TestACL/Parent/KeyReadDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyReadDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadDenied.Prefix(fail) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(fail) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/Parent/ACLReadDenied (0.00s)
        --- PASS: TestACL/Parent/ACLWriteDenied (0.00s)
        --- PASS: TestACL/Parent/SnapshotDenied (0.00s)
        --- PASS: TestACL/Parent/IntentionDefaultAllowDenied (0.00s)
    --- PASS: TestACL/ComplexDefaultAllow (0.04s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(intbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(intbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionDefaultAllowAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zookeeper) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zookeeper) (0.00s)
    --- PASS: TestACL/ExactMatchPrecedence (0.05s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWriteDenied.Prefix(football) (0.00s)
    --- PASS: TestACL/ACLRead (0.00s)
        --- PASS: TestACL/ACLRead/ReadAllowed (0.00s)
        --- PASS: TestACL/ACLRead/WriteDenied (0.00s)
    --- PASS: TestACL/ACLRead#01 (0.00s)
        --- PASS: TestACL/ACLRead#01/ReadAllowed (0.00s)
        --- PASS: TestACL/ACLRead#01/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyWritePrefixDefaultDeny (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/DeniedTopLevelPrefix.Prefix(foo) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/AllowedTopLevelPrefix.Prefix(baz/) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/AllowedPrefixWithNestedWrite.Prefix(foo/) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/DenyPrefixWithNestedRead.Prefix(bar/) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/DenyNoPrefixMatch.Prefix(te) (0.00s)
    --- PASS: TestACL/KeyWritePrefixDefaultAllow (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixAllowed.Prefix(bar) (0.00s)
=== RUN   TestRootAuthorizer
--- PASS: TestRootAuthorizer (0.00s)
=== RUN   TestACLEnforce
=== RUN   TestACLEnforce/RuleNoneRequireRead
=== RUN   TestACLEnforce/RuleNoneRequireWrite
=== RUN   TestACLEnforce/RuleNoneRequireList
=== RUN   TestACLEnforce/RuleReadRequireRead
=== RUN   TestACLEnforce/RuleReadRequireWrite
=== RUN   TestACLEnforce/RuleReadRequireList
=== RUN   TestACLEnforce/RuleListRequireRead
=== RUN   TestACLEnforce/RuleListRequireWrite
=== RUN   TestACLEnforce/RuleListRequireList
=== RUN   TestACLEnforce/RuleWritetRequireRead
=== RUN   TestACLEnforce/RuleWritetRequireWrite
=== RUN   TestACLEnforce/RuleWritetRequireList
=== RUN   TestACLEnforce/RuleDenyRequireRead
=== RUN   TestACLEnforce/RuleDenyRequireWrite
=== RUN   TestACLEnforce/RuleDenyRequireList
--- PASS: TestACLEnforce (0.01s)
    --- PASS: TestACLEnforce/RuleNoneRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleNoneRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleNoneRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireList (0.00s)
=== RUN   TestPolicySourceParse
=== RUN   TestPolicySourceParse/Legacy_Basic
=== RUN   TestPolicySourceParse/Legacy_(JSON)
=== RUN   TestPolicySourceParse/Service_No_Intentions_(Legacy)
=== RUN   TestPolicySourceParse/Service_Intentions_(Legacy)
=== RUN   TestPolicySourceParse/Service_Intention:_invalid_value_(Legacy)
=== RUN   TestPolicySourceParse/Bad_Policy_-_ACL
=== RUN   TestPolicySourceParse/Bad_Policy_-_Agent
=== RUN   TestPolicySourceParse/Bad_Policy_-_Agent_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Key
=== RUN   TestPolicySourceParse/Bad_Policy_-_Key_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Node
=== RUN   TestPolicySourceParse/Bad_Policy_-_Node_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Service
=== RUN   TestPolicySourceParse/Bad_Policy_-_Service_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Session
=== RUN   TestPolicySourceParse/Bad_Policy_-_Session_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Event
=== RUN   TestPolicySourceParse/Bad_Policy_-_Event_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Prepared_Query
=== RUN   TestPolicySourceParse/Bad_Policy_-_Prepared_Query_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Keyring
=== RUN   TestPolicySourceParse/Bad_Policy_-_Operator
=== RUN   TestPolicySourceParse/Keyring_Empty
=== RUN   TestPolicySourceParse/Operator_Empty
--- PASS: TestPolicySourceParse (0.03s)
    --- PASS: TestPolicySourceParse/Legacy_Basic (0.00s)
    --- PASS: TestPolicySourceParse/Legacy_(JSON) (0.00s)
    --- PASS: TestPolicySourceParse/Service_No_Intentions_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Service_Intentions_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Service_Intention:_invalid_value_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_ACL (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Agent (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Agent_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Key (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Key_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Node (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Node_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Service (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Service_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Session (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Session_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Event (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Event_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Prepared_Query (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Prepared_Query_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Keyring (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Operator (0.00s)
    --- PASS: TestPolicySourceParse/Keyring_Empty (0.00s)
    --- PASS: TestPolicySourceParse/Operator_Empty (0.00s)
=== RUN   TestMergePolicies
=== RUN   TestMergePolicies/Agents
=== RUN   TestMergePolicies/Events
=== RUN   TestMergePolicies/Node
=== RUN   TestMergePolicies/Keys
=== RUN   TestMergePolicies/Services
=== RUN   TestMergePolicies/Sessions
=== RUN   TestMergePolicies/Prepared_Queries
=== RUN   TestMergePolicies/Write_Precedence
=== RUN   TestMergePolicies/Deny_Precedence
=== RUN   TestMergePolicies/Read_Precedence
--- PASS: TestMergePolicies (0.02s)
    --- PASS: TestMergePolicies/Agents (0.00s)
    --- PASS: TestMergePolicies/Events (0.00s)
    --- PASS: TestMergePolicies/Node (0.00s)
    --- PASS: TestMergePolicies/Keys (0.00s)
    --- PASS: TestMergePolicies/Services (0.00s)
    --- PASS: TestMergePolicies/Sessions (0.00s)
    --- PASS: TestMergePolicies/Prepared_Queries (0.00s)
    --- PASS: TestMergePolicies/Write_Precedence (0.00s)
    --- PASS: TestMergePolicies/Deny_Precedence (0.00s)
    --- PASS: TestMergePolicies/Read_Precedence (0.00s)
=== RUN   TestRulesTranslate
--- PASS: TestRulesTranslate (0.00s)
=== RUN   TestRulesTranslate_GH5493
--- PASS: TestRulesTranslate_GH5493 (0.00s)
=== RUN   TestPrecedence
=== RUN   TestPrecedence/Deny_Over_Write
=== RUN   TestPrecedence/Deny_Over_List
=== RUN   TestPrecedence/Deny_Over_Read
=== RUN   TestPrecedence/Deny_Over_Unknown
=== RUN   TestPrecedence/Write_Over_List
=== RUN   TestPrecedence/Write_Over_Read
=== RUN   TestPrecedence/Write_Over_Unknown
=== RUN   TestPrecedence/List_Over_Read
=== RUN   TestPrecedence/List_Over_Unknown
=== RUN   TestPrecedence/Read_Over_Unknown
=== RUN   TestPrecedence/Write_Over_Deny
=== RUN   TestPrecedence/List_Over_Deny
=== RUN   TestPrecedence/Read_Over_Deny
=== RUN   TestPrecedence/Deny_Over_Unknown#01
=== RUN   TestPrecedence/List_Over_Write
=== RUN   TestPrecedence/Read_Over_Write
=== RUN   TestPrecedence/Unknown_Over_Write
=== RUN   TestPrecedence/Read_Over_List
=== RUN   TestPrecedence/Unknown_Over_List
=== RUN   TestPrecedence/Unknown_Over_Read
--- PASS: TestPrecedence (0.01s)
    --- PASS: TestPrecedence/Deny_Over_Write (0.00s)
    --- PASS: TestPrecedence/Deny_Over_List (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Read (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Write_Over_List (0.00s)
    --- PASS: TestPrecedence/Write_Over_Read (0.00s)
    --- PASS: TestPrecedence/Write_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/List_Over_Read (0.00s)
    --- PASS: TestPrecedence/List_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Read_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Write_Over_Deny (0.00s)
    --- PASS: TestPrecedence/List_Over_Deny (0.00s)
    --- PASS: TestPrecedence/Read_Over_Deny (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Unknown#01 (0.00s)
    --- PASS: TestPrecedence/List_Over_Write (0.00s)
    --- PASS: TestPrecedence/Read_Over_Write (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_Write (0.00s)
    --- PASS: TestPrecedence/Read_Over_List (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_List (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_Read (0.00s)
PASS
ok  	github.com/hashicorp/consul/acl	0.514s
=== RUN   TestACL_Legacy_Disabled_Response
=== PAUSE TestACL_Legacy_Disabled_Response
=== RUN   TestACL_Legacy_Update
=== PAUSE TestACL_Legacy_Update
=== RUN   TestACL_Legacy_UpdateUpsert
=== PAUSE TestACL_Legacy_UpdateUpsert
=== RUN   TestACL_Legacy_Destroy
=== PAUSE TestACL_Legacy_Destroy
=== RUN   TestACL_Legacy_Clone
=== PAUSE TestACL_Legacy_Clone
=== RUN   TestACL_Legacy_Get
=== PAUSE TestACL_Legacy_Get
=== RUN   TestACL_Legacy_List
--- SKIP: TestACL_Legacy_List (0.00s)
    acl_endpoint_legacy_test.go:253: DM-skipped
=== RUN   TestACLReplicationStatus
=== PAUSE TestACLReplicationStatus
=== RUN   TestACL_Disabled_Response
=== PAUSE TestACL_Disabled_Response
=== RUN   TestACL_Bootstrap
=== PAUSE TestACL_Bootstrap
=== RUN   TestACL_HTTP
=== PAUSE TestACL_HTTP
=== RUN   TestACL_LoginProcedure_HTTP
=== PAUSE TestACL_LoginProcedure_HTTP
=== RUN   TestACL_Version8
=== PAUSE TestACL_Version8
=== RUN   TestACL_AgentMasterToken
=== PAUSE TestACL_AgentMasterToken
=== RUN   TestACL_RootAuthorizersDenied
=== PAUSE TestACL_RootAuthorizersDenied
=== RUN   TestACL_vetServiceRegister
=== PAUSE TestACL_vetServiceRegister
=== RUN   TestACL_vetServiceUpdate
=== PAUSE TestACL_vetServiceUpdate
=== RUN   TestACL_vetCheckRegister
=== PAUSE TestACL_vetCheckRegister
=== RUN   TestACL_vetCheckUpdate
=== PAUSE TestACL_vetCheckUpdate
=== RUN   TestACL_filterMembers
=== PAUSE TestACL_filterMembers
=== RUN   TestACL_filterServices
=== PAUSE TestACL_filterServices
=== RUN   TestACL_filterChecks
=== PAUSE TestACL_filterChecks
=== RUN   TestAgent_Services
=== PAUSE TestAgent_Services
=== RUN   TestAgent_ServicesFiltered
=== PAUSE TestAgent_ServicesFiltered
=== RUN   TestAgent_Services_ExternalConnectProxy
=== PAUSE TestAgent_Services_ExternalConnectProxy
=== RUN   TestAgent_Services_Sidecar
=== PAUSE TestAgent_Services_Sidecar
=== RUN   TestAgent_Services_ACLFilter
=== PAUSE TestAgent_Services_ACLFilter
=== RUN   TestAgent_Service
--- SKIP: TestAgent_Service (0.00s)
    agent_endpoint_test.go:274: DM-skipped
=== RUN   TestAgent_Service_DeprecatedManagedProxy
=== PAUSE TestAgent_Service_DeprecatedManagedProxy
=== RUN   TestAgent_Checks
=== PAUSE TestAgent_Checks
=== RUN   TestAgent_ChecksWithFilter
=== PAUSE TestAgent_ChecksWithFilter
=== RUN   TestAgent_HealthServiceByID
=== PAUSE TestAgent_HealthServiceByID
=== RUN   TestAgent_HealthServiceByName
=== PAUSE TestAgent_HealthServiceByName
=== RUN   TestAgent_Checks_ACLFilter
=== PAUSE TestAgent_Checks_ACLFilter
=== RUN   TestAgent_Self
=== PAUSE TestAgent_Self
=== RUN   TestAgent_Self_ACLDeny
=== PAUSE TestAgent_Self_ACLDeny
=== RUN   TestAgent_Metrics_ACLDeny
=== PAUSE TestAgent_Metrics_ACLDeny
=== RUN   TestAgent_Reload
=== PAUSE TestAgent_Reload
=== RUN   TestAgent_Reload_ACLDeny
=== PAUSE TestAgent_Reload_ACLDeny
=== RUN   TestAgent_Members
=== PAUSE TestAgent_Members
=== RUN   TestAgent_Members_WAN
=== PAUSE TestAgent_Members_WAN
=== RUN   TestAgent_Members_ACLFilter
=== PAUSE TestAgent_Members_ACLFilter
=== RUN   TestAgent_Join
=== PAUSE TestAgent_Join
=== RUN   TestAgent_Join_WAN
=== PAUSE TestAgent_Join_WAN
=== RUN   TestAgent_Join_ACLDeny
=== PAUSE TestAgent_Join_ACLDeny
=== RUN   TestAgent_JoinLANNotify
=== PAUSE TestAgent_JoinLANNotify
=== RUN   TestAgent_Leave
--- SKIP: TestAgent_Leave (0.00s)
    agent_endpoint_test.go:1581: DM-skipped
=== RUN   TestAgent_Leave_ACLDeny
=== PAUSE TestAgent_Leave_ACLDeny
=== RUN   TestAgent_ForceLeave
--- SKIP: TestAgent_ForceLeave (0.00s)
    agent_endpoint_test.go:1649: DM-skipped
=== RUN   TestAgent_ForceLeave_ACLDeny
=== PAUSE TestAgent_ForceLeave_ACLDeny
=== RUN   TestAgent_RegisterCheck
=== PAUSE TestAgent_RegisterCheck
=== RUN   TestAgent_RegisterCheck_Scripts
=== PAUSE TestAgent_RegisterCheck_Scripts
=== RUN   TestAgent_RegisterCheckScriptsExecDisable
=== PAUSE TestAgent_RegisterCheckScriptsExecDisable
=== RUN   TestAgent_RegisterCheckScriptsExecRemoteDisable
=== PAUSE TestAgent_RegisterCheckScriptsExecRemoteDisable
=== RUN   TestAgent_RegisterCheck_Passing
=== PAUSE TestAgent_RegisterCheck_Passing
=== RUN   TestAgent_RegisterCheck_BadStatus
=== PAUSE TestAgent_RegisterCheck_BadStatus
=== RUN   TestAgent_RegisterCheck_ACLDeny
=== PAUSE TestAgent_RegisterCheck_ACLDeny
=== RUN   TestAgent_DeregisterCheck
=== PAUSE TestAgent_DeregisterCheck
=== RUN   TestAgent_DeregisterCheckACLDeny
=== PAUSE TestAgent_DeregisterCheckACLDeny
=== RUN   TestAgent_PassCheck
=== PAUSE TestAgent_PassCheck
=== RUN   TestAgent_PassCheck_ACLDeny
=== PAUSE TestAgent_PassCheck_ACLDeny
=== RUN   TestAgent_WarnCheck
=== PAUSE TestAgent_WarnCheck
=== RUN   TestAgent_WarnCheck_ACLDeny
=== PAUSE TestAgent_WarnCheck_ACLDeny
=== RUN   TestAgent_FailCheck
=== PAUSE TestAgent_FailCheck
=== RUN   TestAgent_FailCheck_ACLDeny
=== PAUSE TestAgent_FailCheck_ACLDeny
=== RUN   TestAgent_UpdateCheck
=== PAUSE TestAgent_UpdateCheck
=== RUN   TestAgent_UpdateCheck_ACLDeny
=== PAUSE TestAgent_UpdateCheck_ACLDeny
=== RUN   TestAgent_RegisterService
=== PAUSE TestAgent_RegisterService
=== RUN   TestAgent_RegisterService_TranslateKeys
=== PAUSE TestAgent_RegisterService_TranslateKeys
=== RUN   TestAgent_RegisterService_ACLDeny
=== PAUSE TestAgent_RegisterService_ACLDeny
=== RUN   TestAgent_RegisterService_InvalidAddress
=== PAUSE TestAgent_RegisterService_InvalidAddress
=== RUN   TestAgent_RegisterService_ManagedConnectProxy
=== PAUSE TestAgent_RegisterService_ManagedConnectProxy
=== RUN   TestAgent_RegisterService_ManagedConnectProxyDeprecated
=== PAUSE TestAgent_RegisterService_ManagedConnectProxyDeprecated
=== RUN   TestAgent_RegisterService_ManagedConnectProxy_Disabled
=== PAUSE TestAgent_RegisterService_ManagedConnectProxy_Disabled
=== RUN   TestAgent_RegisterService_UnmanagedConnectProxy
=== PAUSE TestAgent_RegisterService_UnmanagedConnectProxy
=== RUN   TestAgent_RegisterServiceDeregisterService_Sidecar
--- SKIP: TestAgent_RegisterServiceDeregisterService_Sidecar (0.00s)
    agent_endpoint_test.go:3107: DM-skipped
=== RUN   TestAgent_RegisterService_UnmanagedConnectProxyInvalid
=== PAUSE TestAgent_RegisterService_UnmanagedConnectProxyInvalid
=== RUN   TestAgent_RegisterService_ConnectNative
=== PAUSE TestAgent_RegisterService_ConnectNative
=== RUN   TestAgent_RegisterService_ScriptCheck_ExecDisable
=== PAUSE TestAgent_RegisterService_ScriptCheck_ExecDisable
=== RUN   TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable
=== PAUSE TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable
=== RUN   TestAgent_DeregisterService
=== PAUSE TestAgent_DeregisterService
=== RUN   TestAgent_DeregisterService_ACLDeny
=== PAUSE TestAgent_DeregisterService_ACLDeny
=== RUN   TestAgent_DeregisterService_withManagedProxy
=== PAUSE TestAgent_DeregisterService_withManagedProxy
=== RUN   TestAgent_DeregisterService_managedProxyDirect
=== PAUSE TestAgent_DeregisterService_managedProxyDirect
=== RUN   TestAgent_ServiceMaintenance_BadRequest
=== PAUSE TestAgent_ServiceMaintenance_BadRequest
=== RUN   TestAgent_ServiceMaintenance_Enable
--- SKIP: TestAgent_ServiceMaintenance_Enable (0.00s)
    agent_endpoint_test.go:3936: DM-skipped
=== RUN   TestAgent_ServiceMaintenance_Disable
=== PAUSE TestAgent_ServiceMaintenance_Disable
=== RUN   TestAgent_ServiceMaintenance_ACLDeny
=== PAUSE TestAgent_ServiceMaintenance_ACLDeny
=== RUN   TestAgent_NodeMaintenance_BadRequest
=== PAUSE TestAgent_NodeMaintenance_BadRequest
=== RUN   TestAgent_NodeMaintenance_Enable
=== PAUSE TestAgent_NodeMaintenance_Enable
=== RUN   TestAgent_NodeMaintenance_Disable
=== PAUSE TestAgent_NodeMaintenance_Disable
=== RUN   TestAgent_NodeMaintenance_ACLDeny
=== PAUSE TestAgent_NodeMaintenance_ACLDeny
=== RUN   TestAgent_RegisterCheck_Service
=== PAUSE TestAgent_RegisterCheck_Service
=== RUN   TestAgent_Monitor
--- SKIP: TestAgent_Monitor (0.00s)
    agent_endpoint_test.go:4189: DM-skipped
=== RUN   TestAgent_Monitor_ACLDeny
=== PAUSE TestAgent_Monitor_ACLDeny
=== RUN   TestAgent_Token
=== PAUSE TestAgent_Token
=== RUN   TestAgentConnectCARoots_empty
=== PAUSE TestAgentConnectCARoots_empty
=== RUN   TestAgentConnectCARoots_list
=== PAUSE TestAgentConnectCARoots_list
=== RUN   TestAgentConnectCALeafCert_aclDefaultDeny
=== PAUSE TestAgentConnectCALeafCert_aclDefaultDeny
=== RUN   TestAgentConnectCALeafCert_aclProxyToken
=== PAUSE TestAgentConnectCALeafCert_aclProxyToken
=== RUN   TestAgentConnectCALeafCert_aclProxyTokenOther
=== PAUSE TestAgentConnectCALeafCert_aclProxyTokenOther
=== RUN   TestAgentConnectCALeafCert_aclServiceWrite
=== PAUSE TestAgentConnectCALeafCert_aclServiceWrite
=== RUN   TestAgentConnectCALeafCert_aclServiceReadDeny
=== PAUSE TestAgentConnectCALeafCert_aclServiceReadDeny
=== RUN   TestAgentConnectCALeafCert_good
=== PAUSE TestAgentConnectCALeafCert_good
=== RUN   TestAgentConnectCALeafCert_goodNotLocal
--- SKIP: TestAgentConnectCALeafCert_goodNotLocal (0.00s)
    agent_endpoint_test.go:4991: DM-skipped
=== RUN   TestAgentConnectProxyConfig_Blocking
--- SKIP: TestAgentConnectProxyConfig_Blocking (0.00s)
    agent_endpoint_test.go:5128: DM-skipped
=== RUN   TestAgentConnectProxyConfig_aclDefaultDeny
=== PAUSE TestAgentConnectProxyConfig_aclDefaultDeny
=== RUN   TestAgentConnectProxyConfig_aclProxyToken
=== PAUSE TestAgentConnectProxyConfig_aclProxyToken
=== RUN   TestAgentConnectProxyConfig_aclServiceWrite
=== PAUSE TestAgentConnectProxyConfig_aclServiceWrite
=== RUN   TestAgentConnectProxyConfig_aclServiceReadDeny
=== PAUSE TestAgentConnectProxyConfig_aclServiceReadDeny
=== RUN   TestAgentConnectProxyConfig_ConfigHandling
--- SKIP: TestAgentConnectProxyConfig_ConfigHandling (0.00s)
    agent_endpoint_test.go:5540: DM-skipped
=== RUN   TestAgentConnectAuthorize_badBody
=== PAUSE TestAgentConnectAuthorize_badBody
=== RUN   TestAgentConnectAuthorize_noTarget
=== PAUSE TestAgentConnectAuthorize_noTarget
=== RUN   TestAgentConnectAuthorize_idInvalidFormat
=== PAUSE TestAgentConnectAuthorize_idInvalidFormat
=== RUN   TestAgentConnectAuthorize_idNotService
=== PAUSE TestAgentConnectAuthorize_idNotService
=== RUN   TestAgentConnectAuthorize_allow
=== PAUSE TestAgentConnectAuthorize_allow
=== RUN   TestAgentConnectAuthorize_deny
=== PAUSE TestAgentConnectAuthorize_deny
=== RUN   TestAgentConnectAuthorize_allowTrustDomain
=== PAUSE TestAgentConnectAuthorize_allowTrustDomain
=== RUN   TestAgentConnectAuthorize_denyWildcard
=== PAUSE TestAgentConnectAuthorize_denyWildcard
=== RUN   TestAgentConnectAuthorize_serviceWrite
=== PAUSE TestAgentConnectAuthorize_serviceWrite
=== RUN   TestAgentConnectAuthorize_defaultDeny
=== PAUSE TestAgentConnectAuthorize_defaultDeny
=== RUN   TestAgentConnectAuthorize_defaultAllow
=== PAUSE TestAgentConnectAuthorize_defaultAllow
=== RUN   TestAgent_Host
=== PAUSE TestAgent_Host
=== RUN   TestAgent_HostBadACL
=== PAUSE TestAgent_HostBadACL
=== RUN   TestAgent_MultiStartStop
=== RUN   TestAgent_MultiStartStop/#00
=== PAUSE TestAgent_MultiStartStop/#00
=== RUN   TestAgent_MultiStartStop/#01
=== PAUSE TestAgent_MultiStartStop/#01
=== RUN   TestAgent_MultiStartStop/#02
=== PAUSE TestAgent_MultiStartStop/#02
=== RUN   TestAgent_MultiStartStop/#03
=== PAUSE TestAgent_MultiStartStop/#03
=== RUN   TestAgent_MultiStartStop/#04
=== PAUSE TestAgent_MultiStartStop/#04
=== RUN   TestAgent_MultiStartStop/#05
=== PAUSE TestAgent_MultiStartStop/#05
=== RUN   TestAgent_MultiStartStop/#06
=== PAUSE TestAgent_MultiStartStop/#06
=== RUN   TestAgent_MultiStartStop/#07
=== PAUSE TestAgent_MultiStartStop/#07
=== RUN   TestAgent_MultiStartStop/#08
=== PAUSE TestAgent_MultiStartStop/#08
=== RUN   TestAgent_MultiStartStop/#09
=== PAUSE TestAgent_MultiStartStop/#09
=== CONT  TestAgent_MultiStartStop/#00
=== CONT  TestAgent_MultiStartStop/#08
=== CONT  TestAgent_MultiStartStop/#03
=== CONT  TestAgent_MultiStartStop/#05
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:10.797846 [WARN] agent: Node name "Node 813fe2bc-7956-21a7-19ed-5e1d27fff65c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:10.798992 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:10.824732 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:10.875925 [WARN] agent: Node name "Node 78f2e382-1a50-9b18-0478-4efc3662f011" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:10.876584 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:10.878914 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:10.888311 [WARN] agent: Node name "Node eb70c188-e9db-d339-2535-6932b1b3ea2d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:10.892917 [WARN] agent: Node name "Node c94d2e75-3c47-eeef-31dd-d6c3243177ec" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:10.893271 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:10.893289 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:10.895740 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:10.901366 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:813fe2bc-7956-21a7-19ed-5e1d27fff65c Address:127.0.0.1:17524}]
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17524 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.568743 [INFO] serf: EventMemberJoin: Node 813fe2bc-7956-21a7-19ed-5e1d27fff65c.dc1 127.0.0.1
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.573087 [INFO] serf: EventMemberJoin: Node 813fe2bc-7956-21a7-19ed-5e1d27fff65c 127.0.0.1
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.574121 [INFO] consul: Adding LAN server Node 813fe2bc-7956-21a7-19ed-5e1d27fff65c (Addr: tcp/127.0.0.1:17524) (DC: dc1)
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.574152 [INFO] consul: Handled member-join event for server "Node 813fe2bc-7956-21a7-19ed-5e1d27fff65c.dc1" in area "wan"
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.575026 [INFO] agent: Started DNS server 127.0.0.1:17519 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.575586 [INFO] agent: Started DNS server 127.0.0.1:17519 (udp)
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.578498 [INFO] agent: Started HTTP server on 127.0.0.1:17520 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:12.578661 [INFO] agent: started state syncer
2019/12/30 18:51:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17524 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eb70c188-e9db-d339-2535-6932b1b3ea2d Address:127.0.0.1:17506}]
2019/12/30 18:51:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:78f2e382-1a50-9b18-0478-4efc3662f011 Address:127.0.0.1:17518}]
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17518 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.732949 [INFO] serf: EventMemberJoin: Node eb70c188-e9db-d339-2535-6932b1b3ea2d.dc1 127.0.0.1
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.736963 [INFO] serf: EventMemberJoin: Node eb70c188-e9db-d339-2535-6932b1b3ea2d 127.0.0.1
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.737881 [INFO] serf: EventMemberJoin: Node 78f2e382-1a50-9b18-0478-4efc3662f011.dc1 127.0.0.1
2019/12/30 18:51:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c94d2e75-3c47-eeef-31dd-d6c3243177ec Address:127.0.0.1:17512}]
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17512 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.739655 [INFO] consul: Adding LAN server Node eb70c188-e9db-d339-2535-6932b1b3ea2d (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.739931 [INFO] consul: Handled member-join event for server "Node eb70c188-e9db-d339-2535-6932b1b3ea2d.dc1" in area "wan"
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.765434 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.765545 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.767482 [INFO] serf: EventMemberJoin: Node c94d2e75-3c47-eeef-31dd-d6c3243177ec.dc1 127.0.0.1
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.767930 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:12.768018 [INFO] agent: started state syncer
2019/12/30 18:51:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.771473 [INFO] serf: EventMemberJoin: Node 78f2e382-1a50-9b18-0478-4efc3662f011 127.0.0.1
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.772549 [INFO] consul: Adding LAN server Node 78f2e382-1a50-9b18-0478-4efc3662f011 (Addr: tcp/127.0.0.1:17518) (DC: dc1)
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.772836 [INFO] consul: Handled member-join event for server "Node 78f2e382-1a50-9b18-0478-4efc3662f011.dc1" in area "wan"
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.773613 [INFO] agent: Started DNS server 127.0.0.1:17513 (tcp)
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.773701 [INFO] agent: Started DNS server 127.0.0.1:17513 (udp)
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.782913 [INFO] agent: Started HTTP server on 127.0.0.1:17514 (tcp)
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:12.783074 [INFO] agent: started state syncer
2019/12/30 18:51:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17518 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.786682 [INFO] serf: EventMemberJoin: Node c94d2e75-3c47-eeef-31dd-d6c3243177ec 127.0.0.1
2019/12/30 18:51:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:17512 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.794619 [INFO] agent: Started DNS server 127.0.0.1:17507 (udp)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.795554 [INFO] consul: Handled member-join event for server "Node c94d2e75-3c47-eeef-31dd-d6c3243177ec.dc1" in area "wan"
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.800396 [INFO] agent: Started DNS server 127.0.0.1:17507 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.804755 [INFO] consul: Adding LAN server Node c94d2e75-3c47-eeef-31dd-d6c3243177ec (Addr: tcp/127.0.0.1:17512) (DC: dc1)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.805921 [INFO] agent: Started HTTP server on 127.0.0.1:17508 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:12.806052 [INFO] agent: started state syncer
2019/12/30 18:51:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:13 [INFO]  raft: Node at 127.0.0.1:17524 [Leader] entering Leader state
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:13.480639 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:13.481200 [INFO] consul: New leader elected: Node 813fe2bc-7956-21a7-19ed-5e1d27fff65c
2019/12/30 18:51:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:13 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
2019/12/30 18:51:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:13 [INFO]  raft: Node at 127.0.0.1:17518 [Leader] entering Leader state
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:13.639811 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:13.640261 [INFO] consul: New leader elected: Node eb70c188-e9db-d339-2535-6932b1b3ea2d
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:13.640498 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:13.640827 [INFO] consul: New leader elected: Node 78f2e382-1a50-9b18-0478-4efc3662f011
2019/12/30 18:51:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:13 [INFO]  raft: Node at 127.0.0.1:17512 [Leader] entering Leader state
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:13.643879 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:13.644305 [INFO] consul: New leader elected: Node c94d2e75-3c47-eeef-31dd-d6c3243177ec
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:13.934875 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:13.934996 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:13.935048 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:13.997410 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:13.997511 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:13.997557 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.012420 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.012538 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.012591 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.046223 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.046375 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.046429 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.219629 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.219629 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.220911 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.223189 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.223315 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.228260 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.378150 [INFO] manager: shutting down
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.378150 [INFO] manager: shutting down
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.378650 [INFO] manager: shutting down
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.378815 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.378940 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.378944 [INFO] agent: consul server down
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.379047 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.379103 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.379005 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.379275 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (udp)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.379551 [INFO] agent: Stopping HTTP server 127.0.0.1:17508 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.379888 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#08 - 2019/12/30 18:51:14.379975 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#02
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.380508 [INFO] manager: shutting down
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.382470 [INFO] agent: consul server down
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.382868 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.383076 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.382550 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.382610 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.383247 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.383511 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.383808 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.384192 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#00 - 2019/12/30 18:51:14.384464 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#01
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385187 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385313 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385364 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385437 [INFO] agent: consul server down
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385485 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385540 [INFO] agent: Stopping DNS server 127.0.0.1:17513 (tcp)
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385673 [INFO] agent: Stopping DNS server 127.0.0.1:17513 (udp)
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.385826 [INFO] agent: Stopping HTTP server 127.0.0.1:17514 (tcp)
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.386047 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#03 - 2019/12/30 18:51:14.386111 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#07
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:14.443368 [WARN] agent: Node name "Node 67cc6b68-fb25-2f27-8378-c0d3ed203d81" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:14.444035 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:14.446474 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:14.473354 [WARN] agent: Node name "Node 136ba6a8-3dd7-1fd0-b2f8-e830202d1a14" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:14.473789 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:14.474305 [WARN] agent: Node name "Node 81dbf104-042a-ad4b-3210-89b021413c7c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:14.474866 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:14.476082 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:14.477137 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.511685 [INFO] agent: consul server down
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.511775 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.511841 [INFO] agent: Stopping DNS server 127.0.0.1:17519 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.512019 [INFO] agent: Stopping DNS server 127.0.0.1:17519 (udp)
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.512197 [INFO] agent: Stopping HTTP server 127.0.0.1:17520 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.512462 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.512540 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#09
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.514762 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#05 - 2019/12/30 18:51:14.515317 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:14.593215 [WARN] agent: Node name "Node 8d410c2b-75e9-cb4c-754c-0b43e8995794" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:14.594081 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:14.596981 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:67cc6b68-fb25-2f27-8378-c0d3ed203d81 Address:127.0.0.1:17536}]
2019/12/30 18:51:16 [INFO]  raft: Node at 127.0.0.1:17536 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.834208 [INFO] serf: EventMemberJoin: Node 67cc6b68-fb25-2f27-8378-c0d3ed203d81.dc1 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.837530 [INFO] serf: EventMemberJoin: Node 67cc6b68-fb25-2f27-8378-c0d3ed203d81 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.838495 [INFO] consul: Adding LAN server Node 67cc6b68-fb25-2f27-8378-c0d3ed203d81 (Addr: tcp/127.0.0.1:17536) (DC: dc1)
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.838836 [INFO] consul: Handled member-join event for server "Node 67cc6b68-fb25-2f27-8378-c0d3ed203d81.dc1" in area "wan"
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.838864 [INFO] agent: Started DNS server 127.0.0.1:17531 (udp)
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.839219 [INFO] agent: Started DNS server 127.0.0.1:17531 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.846731 [INFO] agent: Started HTTP server on 127.0.0.1:17532 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:16.846841 [INFO] agent: started state syncer
2019/12/30 18:51:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:16 [INFO]  raft: Node at 127.0.0.1:17536 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:136ba6a8-3dd7-1fd0-b2f8-e830202d1a14 Address:127.0.0.1:17542}]
2019/12/30 18:51:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:81dbf104-042a-ad4b-3210-89b021413c7c Address:127.0.0.1:17530}]
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:16.973880 [INFO] serf: EventMemberJoin: Node 136ba6a8-3dd7-1fd0-b2f8-e830202d1a14.dc1 127.0.0.1
2019/12/30 18:51:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8d410c2b-75e9-cb4c-754c-0b43e8995794 Address:127.0.0.1:17548}]
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:16.976817 [INFO] serf: EventMemberJoin: Node 81dbf104-042a-ad4b-3210-89b021413c7c.dc1 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:16.978337 [INFO] serf: EventMemberJoin: Node 8d410c2b-75e9-cb4c-754c-0b43e8995794.dc1 127.0.0.1
2019/12/30 18:51:16 [INFO]  raft: Node at 127.0.0.1:17548 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:16.994295 [INFO] serf: EventMemberJoin: Node 81dbf104-042a-ad4b-3210-89b021413c7c 127.0.0.1
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.016712 [INFO] serf: EventMemberJoin: Node 136ba6a8-3dd7-1fd0-b2f8-e830202d1a14 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:16.995417 [INFO] serf: EventMemberJoin: Node 8d410c2b-75e9-cb4c-754c-0b43e8995794 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.038760 [INFO] agent: Started DNS server 127.0.0.1:17543 (udp)
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17542 [Follower] entering Follower state (Leader: "")
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17530 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.039149 [INFO] consul: Handled member-join event for server "Node 8d410c2b-75e9-cb4c-754c-0b43e8995794.dc1" in area "wan"
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.039689 [INFO] agent: Started DNS server 127.0.0.1:17543 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.017608 [INFO] agent: Started DNS server 127.0.0.1:17525 (udp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.041935 [INFO] agent: Started DNS server 127.0.0.1:17525 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.042257 [INFO] agent: Started HTTP server on 127.0.0.1:17544 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.042340 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.044097 [INFO] agent: Started DNS server 127.0.0.1:17537 (udp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.044310 [INFO] agent: Started HTTP server on 127.0.0.1:17526 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.044447 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.044678 [INFO] consul: Adding LAN server Node 136ba6a8-3dd7-1fd0-b2f8-e830202d1a14 (Addr: tcp/127.0.0.1:17542) (DC: dc1)
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.044863 [INFO] consul: Handled member-join event for server "Node 136ba6a8-3dd7-1fd0-b2f8-e830202d1a14.dc1" in area "wan"
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.044935 [INFO] consul: Adding LAN server Node 8d410c2b-75e9-cb4c-754c-0b43e8995794 (Addr: tcp/127.0.0.1:17548) (DC: dc1)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.018950 [INFO] consul: Handled member-join event for server "Node 81dbf104-042a-ad4b-3210-89b021413c7c.dc1" in area "wan"
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.045347 [INFO] agent: Started DNS server 127.0.0.1:17537 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.037036 [INFO] consul: Adding LAN server Node 81dbf104-042a-ad4b-3210-89b021413c7c (Addr: tcp/127.0.0.1:17530) (DC: dc1)
2019/12/30 18:51:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17548 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.047954 [INFO] agent: Started HTTP server on 127.0.0.1:17538 (tcp)
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.048052 [INFO] agent: started state syncer
2019/12/30 18:51:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17542 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17530 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17536 [Leader] entering Leader state
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:17.822713 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:17.823132 [INFO] consul: New leader elected: Node 67cc6b68-fb25-2f27-8378-c0d3ed203d81
2019/12/30 18:51:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17548 [Leader] entering Leader state
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17542 [Leader] entering Leader state
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.964292 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.964318 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:17.964722 [INFO] consul: New leader elected: Node 8d410c2b-75e9-cb4c-754c-0b43e8995794
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:17.964722 [INFO] consul: New leader elected: Node 136ba6a8-3dd7-1fd0-b2f8-e830202d1a14
2019/12/30 18:51:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:17530 [Leader] entering Leader state
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.974460 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:17.974887 [INFO] consul: New leader elected: Node 81dbf104-042a-ad4b-3210-89b021413c7c
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.114992 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.115098 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.115154 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.316376 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.318519 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.318598 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.318640 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.398762 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.398873 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.398935 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.422501 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.422655 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.425115 [INFO] manager: shutting down
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.425366 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.462301 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.462426 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.462487 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.536288 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.536379 [INFO] manager: shutting down
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.536461 [WARN] agent: Syncing node info failed. raft is already shutdown
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.536529 [ERR] agent: failed to sync remote state: raft is already shutdown
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.537052 [INFO] agent: consul server down
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.537116 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.537178 [INFO] agent: Stopping DNS server 127.0.0.1:17543 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.537374 [INFO] agent: Stopping DNS server 127.0.0.1:17543 (udp)
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.537560 [INFO] agent: Stopping HTTP server 127.0.0.1:17544 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.537779 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.537854 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#06
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.538589 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.542055 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.542665 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.542896 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.543094 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_MultiStartStop/#09 - 2019/12/30 18:51:18.543194 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:18.593874 [WARN] agent: Node name "Node b24deb9d-083b-2336-d1c7-c286c5ab37d7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:18.594465 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:18.596791 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.669706 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.670091 [INFO] manager: shutting down
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.671686 [INFO] agent: consul server down
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.671753 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.671816 [INFO] agent: Stopping DNS server 127.0.0.1:17531 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.671994 [INFO] agent: Stopping DNS server 127.0.0.1:17531 (udp)
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.672179 [INFO] agent: Stopping HTTP server 127.0.0.1:17532 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.672404 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.672488 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.672705 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#01 - 2019/12/30 18:51:18.672998 [ERR] consul: failed to establish leadership: raft is already shutdown
=== CONT  TestAgent_MultiStartStop/#04
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:18.732214 [WARN] agent: Node name "Node 39056900-ec78-7790-7d5b-1b445479129a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:18.732658 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:18.734937 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.886577 [INFO] manager: shutting down
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.887462 [INFO] agent: consul server down
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.887515 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.887568 [INFO] agent: Stopping DNS server 127.0.0.1:17525 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.887749 [INFO] agent: Stopping DNS server 127.0.0.1:17525 (udp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.887926 [INFO] agent: Stopping HTTP server 127.0.0.1:17526 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.888140 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.888207 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#02 - 2019/12/30 18:51:18.888631 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.891270 [INFO] agent: consul server down
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.891356 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.891421 [INFO] agent: Stopping DNS server 127.0.0.1:17537 (tcp)
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.891604 [INFO] agent: Stopping DNS server 127.0.0.1:17537 (udp)
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.891787 [INFO] agent: Stopping HTTP server 127.0.0.1:17538 (tcp)
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.892000 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.892079 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.892297 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#07 - 2019/12/30 18:51:18.892575 [ERR] consul: failed to establish leadership: raft is already shutdown
2019/12/30 18:51:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b24deb9d-083b-2336-d1c7-c286c5ab37d7 Address:127.0.0.1:17554}]
2019/12/30 18:51:19 [INFO]  raft: Node at 127.0.0.1:17554 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.892778 [INFO] serf: EventMemberJoin: Node b24deb9d-083b-2336-d1c7-c286c5ab37d7.dc1 127.0.0.1
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.896147 [INFO] serf: EventMemberJoin: Node b24deb9d-083b-2336-d1c7-c286c5ab37d7 127.0.0.1
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.896907 [INFO] consul: Handled member-join event for server "Node b24deb9d-083b-2336-d1c7-c286c5ab37d7.dc1" in area "wan"
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.897220 [INFO] consul: Adding LAN server Node b24deb9d-083b-2336-d1c7-c286c5ab37d7 (Addr: tcp/127.0.0.1:17554) (DC: dc1)
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.901898 [INFO] agent: Started DNS server 127.0.0.1:17549 (udp)
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.902066 [INFO] agent: Started DNS server 127.0.0.1:17549 (tcp)
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.909359 [INFO] agent: Started HTTP server on 127.0.0.1:17550 (tcp)
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:19.910877 [INFO] agent: started state syncer
2019/12/30 18:51:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:19 [INFO]  raft: Node at 127.0.0.1:17554 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:39056900-ec78-7790-7d5b-1b445479129a Address:127.0.0.1:17560}]
2019/12/30 18:51:20 [INFO]  raft: Node at 127.0.0.1:17560 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.047053 [INFO] serf: EventMemberJoin: Node 39056900-ec78-7790-7d5b-1b445479129a.dc1 127.0.0.1
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.051651 [INFO] serf: EventMemberJoin: Node 39056900-ec78-7790-7d5b-1b445479129a 127.0.0.1
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.053790 [INFO] consul: Adding LAN server Node 39056900-ec78-7790-7d5b-1b445479129a (Addr: tcp/127.0.0.1:17560) (DC: dc1)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.055027 [INFO] consul: Handled member-join event for server "Node 39056900-ec78-7790-7d5b-1b445479129a.dc1" in area "wan"
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.056717 [INFO] agent: Started DNS server 127.0.0.1:17555 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.057408 [INFO] agent: Started DNS server 127.0.0.1:17555 (udp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.066937 [INFO] agent: Started HTTP server on 127.0.0.1:17556 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:20.067173 [INFO] agent: started state syncer
2019/12/30 18:51:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:20 [INFO]  raft: Node at 127.0.0.1:17560 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:20 [INFO]  raft: Node at 127.0.0.1:17554 [Leader] entering Leader state
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:20.799819 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:20.800234 [INFO] consul: New leader elected: Node b24deb9d-083b-2336-d1c7-c286c5ab37d7
2019/12/30 18:51:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:21 [INFO]  raft: Node at 127.0.0.1:17560 [Leader] entering Leader state
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.208799 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.209219 [INFO] consul: New leader elected: Node 39056900-ec78-7790-7d5b-1b445479129a
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.322893 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.322996 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.323043 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.488944 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.490514 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.503431 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.503527 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.503576 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.728016 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.732635 [INFO] manager: shutting down
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.869849 [INFO] manager: shutting down
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.955719 [INFO] agent: consul server down
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.955795 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.955852 [INFO] agent: Stopping DNS server 127.0.0.1:17555 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.956001 [INFO] agent: Stopping DNS server 127.0.0.1:17555 (udp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.956159 [INFO] agent: Stopping HTTP server 127.0.0.1:17556 (tcp)
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956279 [INFO] agent: consul server down
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956325 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.956343 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956388 [INFO] agent: Stopping DNS server 127.0.0.1:17549 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.956403 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956505 [INFO] agent: Stopping DNS server 127.0.0.1:17549 (udp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.956563 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956637 [INFO] agent: Stopping HTTP server 127.0.0.1:17550 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.956653 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#04 - 2019/12/30 18:51:21.956703 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956817 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956877 [INFO] agent: Endpoints down
--- PASS: TestAgent_MultiStartStop (0.00s)
    --- PASS: TestAgent_MultiStartStop/#08 (3.76s)
    --- PASS: TestAgent_MultiStartStop/#00 (3.77s)
    --- PASS: TestAgent_MultiStartStop/#03 (3.76s)
    --- PASS: TestAgent_MultiStartStop/#05 (3.88s)
    --- PASS: TestAgent_MultiStartStop/#09 (4.03s)
    --- PASS: TestAgent_MultiStartStop/#01 (4.29s)
    --- PASS: TestAgent_MultiStartStop/#02 (4.51s)
    --- PASS: TestAgent_MultiStartStop/#07 (4.51s)
    --- PASS: TestAgent_MultiStartStop/#04 (3.28s)
    --- PASS: TestAgent_MultiStartStop/#06 (3.42s)
=== RUN   TestAgent_ConnectClusterIDConfig
=== RUN   TestAgent_ConnectClusterIDConfig/default_TestAgent_has_fixed_cluster_id
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.956877 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.958138 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#06 - 2019/12/30 18:51:21.958310 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/12/30 18:51:22.036927 [WARN] agent: Node name "Node 99b56933-a558-d066-fe71-e382a3eb7c69" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/12/30 18:51:22.037476 [DEBUG] tlsutil: Update with version 1
test - 2019/12/30 18:51:22.039837 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:99b56933-a558-d066-fe71-e382a3eb7c69 Address:127.0.0.1:17566}]
2019/12/30 18:51:23 [INFO]  raft: Node at 127.0.0.1:17566 [Follower] entering Follower state (Leader: "")
test - 2019/12/30 18:51:23.432903 [INFO] serf: EventMemberJoin: Node 99b56933-a558-d066-fe71-e382a3eb7c69.dc1 127.0.0.1
test - 2019/12/30 18:51:23.436519 [INFO] serf: EventMemberJoin: Node 99b56933-a558-d066-fe71-e382a3eb7c69 127.0.0.1
test - 2019/12/30 18:51:23.437331 [INFO] consul: Handled member-join event for server "Node 99b56933-a558-d066-fe71-e382a3eb7c69.dc1" in area "wan"
test - 2019/12/30 18:51:23.437622 [INFO] consul: Adding LAN server Node 99b56933-a558-d066-fe71-e382a3eb7c69 (Addr: tcp/127.0.0.1:17566) (DC: dc1)
test - 2019/12/30 18:51:23.437830 [INFO] agent: Started DNS server 127.0.0.1:17561 (udp)
test - 2019/12/30 18:51:23.438135 [INFO] agent: Started DNS server 127.0.0.1:17561 (tcp)
test - 2019/12/30 18:51:23.445867 [INFO] agent: Started HTTP server on 127.0.0.1:17562 (tcp)
test - 2019/12/30 18:51:23.445984 [INFO] agent: started state syncer
2019/12/30 18:51:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:23 [INFO]  raft: Node at 127.0.0.1:17566 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:24 [INFO]  raft: Node at 127.0.0.1:17566 [Leader] entering Leader state
test - 2019/12/30 18:51:24.345346 [INFO] consul: cluster leadership acquired
test - 2019/12/30 18:51:24.345819 [INFO] consul: New leader elected: Node 99b56933-a558-d066-fe71-e382a3eb7c69
test - 2019/12/30 18:51:24.456953 [INFO] agent: Requesting shutdown
test - 2019/12/30 18:51:24.457061 [INFO] consul: shutting down server
test - 2019/12/30 18:51:24.457127 [WARN] serf: Shutdown without a Leave
test - 2019/12/30 18:51:24.457229 [ERR] agent: failed to sync remote state: No cluster leader
test - 2019/12/30 18:51:24.594663 [WARN] serf: Shutdown without a Leave
test - 2019/12/30 18:51:24.736455 [INFO] manager: shutting down
test - 2019/12/30 18:51:24.803185 [ERR] consul: failed to wait for barrier: leadership lost while committing log
test - 2019/12/30 18:51:24.803502 [INFO] agent: consul server down
test - 2019/12/30 18:51:24.803552 [INFO] agent: shutdown complete
test - 2019/12/30 18:51:24.803609 [INFO] agent: Stopping DNS server 127.0.0.1:17561 (tcp)
test - 2019/12/30 18:51:24.803758 [INFO] agent: Stopping DNS server 127.0.0.1:17561 (udp)
test - 2019/12/30 18:51:24.803922 [INFO] agent: Stopping HTTP server 127.0.0.1:17562 (tcp)
test - 2019/12/30 18:51:24.804144 [INFO] agent: Waiting for endpoints to shut down
test - 2019/12/30 18:51:24.804215 [INFO] agent: Endpoints down
=== RUN   TestAgent_ConnectClusterIDConfig/no_cluster_ID_specified_sets_to_test_ID
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/12/30 18:51:24.910534 [WARN] agent: Node name "Node 5f3c6c05-eab9-7969-b1f2-40da3bf11631" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/12/30 18:51:24.911200 [DEBUG] tlsutil: Update with version 1
test - 2019/12/30 18:51:24.913687 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5f3c6c05-eab9-7969-b1f2-40da3bf11631 Address:127.0.0.1:17572}]
2019/12/30 18:51:25 [INFO]  raft: Node at 127.0.0.1:17572 [Follower] entering Follower state (Leader: "")
test - 2019/12/30 18:51:25.682219 [INFO] serf: EventMemberJoin: Node 5f3c6c05-eab9-7969-b1f2-40da3bf11631.dc1 127.0.0.1
test - 2019/12/30 18:51:25.695831 [INFO] serf: EventMemberJoin: Node 5f3c6c05-eab9-7969-b1f2-40da3bf11631 127.0.0.1
test - 2019/12/30 18:51:25.696421 [INFO] consul: Handled member-join event for server "Node 5f3c6c05-eab9-7969-b1f2-40da3bf11631.dc1" in area "wan"
test - 2019/12/30 18:51:25.696741 [INFO] consul: Adding LAN server Node 5f3c6c05-eab9-7969-b1f2-40da3bf11631 (Addr: tcp/127.0.0.1:17572) (DC: dc1)
test - 2019/12/30 18:51:25.697077 [INFO] agent: Started DNS server 127.0.0.1:17567 (tcp)
test - 2019/12/30 18:51:25.697159 [INFO] agent: Started DNS server 127.0.0.1:17567 (udp)
test - 2019/12/30 18:51:25.699510 [INFO] agent: Started HTTP server on 127.0.0.1:17568 (tcp)
test - 2019/12/30 18:51:25.699617 [INFO] agent: started state syncer
2019/12/30 18:51:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:25 [INFO]  raft: Node at 127.0.0.1:17572 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:26 [INFO]  raft: Node at 127.0.0.1:17572 [Leader] entering Leader state
test - 2019/12/30 18:51:26.178607 [INFO] consul: cluster leadership acquired
test - 2019/12/30 18:51:26.179063 [INFO] consul: New leader elected: Node 5f3c6c05-eab9-7969-b1f2-40da3bf11631
test - 2019/12/30 18:51:26.277981 [INFO] agent: Requesting shutdown
test - 2019/12/30 18:51:26.278083 [INFO] consul: shutting down server
test - 2019/12/30 18:51:26.278129 [WARN] serf: Shutdown without a Leave
test - 2019/12/30 18:51:26.396276 [WARN] serf: Shutdown without a Leave
test - 2019/12/30 18:51:26.536605 [INFO] manager: shutting down
test - 2019/12/30 18:51:26.586622 [ERR] consul: failed to wait for barrier: leadership lost while committing log
test - 2019/12/30 18:51:26.586860 [WARN] agent: Syncing node info failed. leadership lost while committing log
test - 2019/12/30 18:51:26.586931 [ERR] agent: failed to sync remote state: leadership lost while committing log
test - 2019/12/30 18:51:26.587005 [INFO] agent: consul server down
test - 2019/12/30 18:51:26.587054 [INFO] agent: shutdown complete
test - 2019/12/30 18:51:26.587107 [INFO] agent: Stopping DNS server 127.0.0.1:17567 (tcp)
test - 2019/12/30 18:51:26.587284 [INFO] agent: Stopping DNS server 127.0.0.1:17567 (udp)
test - 2019/12/30 18:51:26.587485 [INFO] agent: Stopping HTTP server 127.0.0.1:17568 (tcp)
test - 2019/12/30 18:51:26.587765 [INFO] agent: Waiting for endpoints to shut down
test - 2019/12/30 18:51:26.587850 [INFO] agent: Endpoints down
=== RUN   TestAgent_ConnectClusterIDConfig/non-UUID_cluster_id_is_fatal
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/12/30 18:51:26.649278 [WARN] agent: Node name "Node f05d82d7-c0db-8947-2dcb-5418f2138406" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/12/30 18:51:26.649656 [ERR] connect CA config cluster_id specified but is not a valid UUID, aborting startup
--- PASS: TestAgent_ConnectClusterIDConfig (4.69s)
    --- PASS: TestAgent_ConnectClusterIDConfig/default_TestAgent_has_fixed_cluster_id (2.85s)
    --- PASS: TestAgent_ConnectClusterIDConfig/no_cluster_ID_specified_sets_to_test_ID (1.78s)
    --- PASS: TestAgent_ConnectClusterIDConfig/non-UUID_cluster_id_is_fatal (0.06s)
=== RUN   TestAgent_StartStop
=== PAUSE TestAgent_StartStop
=== RUN   TestAgent_RPCPing
=== PAUSE TestAgent_RPCPing
=== RUN   TestAgent_TokenStore
=== PAUSE TestAgent_TokenStore
=== RUN   TestAgent_ReconnectConfigSettings
=== PAUSE TestAgent_ReconnectConfigSettings
=== RUN   TestAgent_ReconnectConfigWanDisabled
=== PAUSE TestAgent_ReconnectConfigWanDisabled
=== RUN   TestAgent_setupNodeID
=== PAUSE TestAgent_setupNodeID
=== RUN   TestAgent_makeNodeID
=== PAUSE TestAgent_makeNodeID
=== RUN   TestAgent_AddService
=== PAUSE TestAgent_AddService
=== RUN   TestAgent_AddServiceNoExec
=== PAUSE TestAgent_AddServiceNoExec
=== RUN   TestAgent_AddServiceNoRemoteExec
=== PAUSE TestAgent_AddServiceNoRemoteExec
=== RUN   TestAgent_RemoveService
=== PAUSE TestAgent_RemoveService
=== RUN   TestAgent_RemoveServiceRemovesAllChecks
=== PAUSE TestAgent_RemoveServiceRemovesAllChecks
=== RUN   TestAgent_IndexChurn
=== PAUSE TestAgent_IndexChurn
=== RUN   TestAgent_AddCheck
=== PAUSE TestAgent_AddCheck
=== RUN   TestAgent_AddCheck_StartPassing
=== PAUSE TestAgent_AddCheck_StartPassing
=== RUN   TestAgent_AddCheck_MinInterval
=== PAUSE TestAgent_AddCheck_MinInterval
=== RUN   TestAgent_AddCheck_MissingService
=== PAUSE TestAgent_AddCheck_MissingService
=== RUN   TestAgent_AddCheck_RestoreState
=== PAUSE TestAgent_AddCheck_RestoreState
=== RUN   TestAgent_AddCheck_ExecDisable
=== PAUSE TestAgent_AddCheck_ExecDisable
=== RUN   TestAgent_AddCheck_ExecRemoteDisable
=== PAUSE TestAgent_AddCheck_ExecRemoteDisable
=== RUN   TestAgent_AddCheck_GRPC
=== PAUSE TestAgent_AddCheck_GRPC
=== RUN   TestAgent_RestoreServiceWithAliasCheck
--- SKIP: TestAgent_RestoreServiceWithAliasCheck (0.00s)
    agent_test.go:1149: skipping slow test; set SLOWTEST=1 to run
=== RUN   TestAgent_AddCheck_Alias
=== PAUSE TestAgent_AddCheck_Alias
=== RUN   TestAgent_AddCheck_Alias_setToken
=== PAUSE TestAgent_AddCheck_Alias_setToken
=== RUN   TestAgent_AddCheck_Alias_userToken
=== PAUSE TestAgent_AddCheck_Alias_userToken
=== RUN   TestAgent_AddCheck_Alias_userAndSetToken
=== PAUSE TestAgent_AddCheck_Alias_userAndSetToken
=== RUN   TestAgent_RemoveCheck
=== PAUSE TestAgent_RemoveCheck
=== RUN   TestAgent_HTTPCheck_TLSSkipVerify
=== PAUSE TestAgent_HTTPCheck_TLSSkipVerify
=== RUN   TestAgent_HTTPCheck_EnableAgentTLSForChecks
--- SKIP: TestAgent_HTTPCheck_EnableAgentTLSForChecks (0.00s)
    agent_test.go:1521: DM-skipped
=== RUN   TestAgent_updateTTLCheck
=== PAUSE TestAgent_updateTTLCheck
=== RUN   TestAgent_PersistService
=== PAUSE TestAgent_PersistService
=== RUN   TestAgent_persistedService_compat
=== PAUSE TestAgent_persistedService_compat
=== RUN   TestAgent_PurgeService
=== PAUSE TestAgent_PurgeService
=== RUN   TestAgent_PurgeServiceOnDuplicate
=== PAUSE TestAgent_PurgeServiceOnDuplicate
=== RUN   TestAgent_PersistProxy
=== PAUSE TestAgent_PersistProxy
=== RUN   TestAgent_PurgeProxy
=== PAUSE TestAgent_PurgeProxy
=== RUN   TestAgent_PurgeProxyOnDuplicate
=== PAUSE TestAgent_PurgeProxyOnDuplicate
=== RUN   TestAgent_PersistCheck
=== PAUSE TestAgent_PersistCheck
=== RUN   TestAgent_PurgeCheck
--- SKIP: TestAgent_PurgeCheck (0.00s)
    agent_test.go:2146: DM-skipped
=== RUN   TestAgent_PurgeCheckOnDuplicate
=== PAUSE TestAgent_PurgeCheckOnDuplicate
=== RUN   TestAgent_loadChecks_token
=== PAUSE TestAgent_loadChecks_token
=== RUN   TestAgent_unloadChecks
=== PAUSE TestAgent_unloadChecks
=== RUN   TestAgent_loadServices_token
=== PAUSE TestAgent_loadServices_token
=== RUN   TestAgent_loadServices_sidecar
=== PAUSE TestAgent_loadServices_sidecar
=== RUN   TestAgent_loadServices_sidecarSeparateToken
=== PAUSE TestAgent_loadServices_sidecarSeparateToken
=== RUN   TestAgent_loadServices_sidecarInheritMeta
=== PAUSE TestAgent_loadServices_sidecarInheritMeta
=== RUN   TestAgent_loadServices_sidecarOverrideMeta
=== PAUSE TestAgent_loadServices_sidecarOverrideMeta
=== RUN   TestAgent_unloadServices
=== PAUSE TestAgent_unloadServices
=== RUN   TestAgent_loadProxies
=== PAUSE TestAgent_loadProxies
=== RUN   TestAgent_loadProxies_nilProxy
=== PAUSE TestAgent_loadProxies_nilProxy
=== RUN   TestAgent_unloadProxies
=== PAUSE TestAgent_unloadProxies
=== RUN   TestAgent_Service_MaintenanceMode
=== PAUSE TestAgent_Service_MaintenanceMode
=== RUN   TestAgent_Service_Reap
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Service_Reap - 2019/12/30 18:51:26.719176 [WARN] agent: Node name "Node 479c9a40-d285-9e68-596f-a4b76a406d94" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Service_Reap - 2019/12/30 18:51:26.719803 [DEBUG] tlsutil: Update with version 1
TestAgent_Service_Reap - 2019/12/30 18:51:26.722003 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:479c9a40-d285-9e68-596f-a4b76a406d94 Address:127.0.0.1:17584}]
2019/12/30 18:51:27 [INFO]  raft: Node at 127.0.0.1:17584 [Follower] entering Follower state (Leader: "")
TestAgent_Service_Reap - 2019/12/30 18:51:27.432408 [INFO] serf: EventMemberJoin: Node 479c9a40-d285-9e68-596f-a4b76a406d94.dc1 127.0.0.1
TestAgent_Service_Reap - 2019/12/30 18:51:27.435936 [INFO] serf: EventMemberJoin: Node 479c9a40-d285-9e68-596f-a4b76a406d94 127.0.0.1
TestAgent_Service_Reap - 2019/12/30 18:51:27.436765 [INFO] consul: Handled member-join event for server "Node 479c9a40-d285-9e68-596f-a4b76a406d94.dc1" in area "wan"
TestAgent_Service_Reap - 2019/12/30 18:51:27.437349 [INFO] agent: Started DNS server 127.0.0.1:17579 (udp)
TestAgent_Service_Reap - 2019/12/30 18:51:27.437707 [INFO] agent: Started DNS server 127.0.0.1:17579 (tcp)
TestAgent_Service_Reap - 2019/12/30 18:51:27.437785 [INFO] consul: Adding LAN server Node 479c9a40-d285-9e68-596f-a4b76a406d94 (Addr: tcp/127.0.0.1:17584) (DC: dc1)
TestAgent_Service_Reap - 2019/12/30 18:51:27.440181 [INFO] agent: Started HTTP server on 127.0.0.1:17580 (tcp)
TestAgent_Service_Reap - 2019/12/30 18:51:27.440323 [INFO] agent: started state syncer
2019/12/30 18:51:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:27 [INFO]  raft: Node at 127.0.0.1:17584 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:27 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:27 [INFO]  raft: Node at 127.0.0.1:17584 [Leader] entering Leader state
TestAgent_Service_Reap - 2019/12/30 18:51:27.895282 [INFO] consul: cluster leadership acquired
TestAgent_Service_Reap - 2019/12/30 18:51:27.895725 [INFO] consul: New leader elected: Node 479c9a40-d285-9e68-596f-a4b76a406d94
TestAgent_Service_Reap - 2019/12/30 18:51:28.491585 [INFO] agent: Synced node info
TestAgent_Service_Reap - 2019/12/30 18:51:28.492656 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:28.752075 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.178760 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_Service_Reap - 2019/12/30 18:51:29.179335 [DEBUG] consul: Skipping self join check for "Node 479c9a40-d285-9e68-596f-a4b76a406d94" since the cluster is too small
TestAgent_Service_Reap - 2019/12/30 18:51:29.179555 [INFO] consul: member 'Node 479c9a40-d285-9e68-596f-a4b76a406d94' joined, marking health alive
TestAgent_Service_Reap - 2019/12/30 18:51:29.360487 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_Reap - 2019/12/30 18:51:29.488673 [INFO] agent: Synced service "redis"
TestAgent_Service_Reap - 2019/12/30 18:51:29.488767 [DEBUG] agent: Check "service:redis" in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.488808 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.489019 [DEBUG] agent: Check "service:redis" status is now passing
TestAgent_Service_Reap - 2019/12/30 18:51:29.489050 [DEBUG] agent: Service "redis" in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.654822 [INFO] agent: Synced check "service:redis"
TestAgent_Service_Reap - 2019/12/30 18:51:29.654900 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.656728 [DEBUG] agent: Service "redis" in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.680383 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_Reap - 2019/12/30 18:51:29.813041 [INFO] agent: Synced check "service:redis"
TestAgent_Service_Reap - 2019/12/30 18:51:29.813117 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.813352 [DEBUG] agent: Service "redis" in sync
TestAgent_Service_Reap - 2019/12/30 18:51:29.988180 [INFO] agent: Synced check "service:redis"
TestAgent_Service_Reap - 2019/12/30 18:51:29.988267 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:30.213775 [INFO] agent: Deregistered service "redis"
TestAgent_Service_Reap - 2019/12/30 18:51:30.380136 [INFO] agent: Deregistered check "service:redis"
TestAgent_Service_Reap - 2019/12/30 18:51:30.380218 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:30.380372 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/30 18:51:30.380478 [INFO] agent: Requesting shutdown
TestAgent_Service_Reap - 2019/12/30 18:51:30.381176 [DEBUG] agent: removed check "service:redis"
TestAgent_Service_Reap - 2019/12/30 18:51:30.381242 [DEBUG] agent: removed service "redis"
TestAgent_Service_Reap - 2019/12/30 18:51:30.381316 [INFO] agent: Check "service:redis" for service "redis" has been critical for too long; deregistered service
TestAgent_Service_Reap - 2019/12/30 18:51:30.381403 [INFO] consul: shutting down server
TestAgent_Service_Reap - 2019/12/30 18:51:30.381452 [WARN] serf: Shutdown without a Leave
TestAgent_Service_Reap - 2019/12/30 18:51:30.446752 [WARN] serf: Shutdown without a Leave
TestAgent_Service_Reap - 2019/12/30 18:51:30.503278 [INFO] manager: shutting down
TestAgent_Service_Reap - 2019/12/30 18:51:30.503795 [INFO] agent: consul server down
TestAgent_Service_Reap - 2019/12/30 18:51:30.503854 [INFO] agent: shutdown complete
TestAgent_Service_Reap - 2019/12/30 18:51:30.503910 [INFO] agent: Stopping DNS server 127.0.0.1:17579 (tcp)
TestAgent_Service_Reap - 2019/12/30 18:51:30.504064 [INFO] agent: Stopping DNS server 127.0.0.1:17579 (udp)
TestAgent_Service_Reap - 2019/12/30 18:51:30.504230 [INFO] agent: Stopping HTTP server 127.0.0.1:17580 (tcp)
TestAgent_Service_Reap - 2019/12/30 18:51:30.504561 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Service_Reap - 2019/12/30 18:51:30.504653 [INFO] agent: Endpoints down
--- PASS: TestAgent_Service_Reap (3.85s)
=== RUN   TestAgent_Service_NoReap
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Service_NoReap - 2019/12/30 18:51:30.569591 [WARN] agent: Node name "Node fa892919-f1f5-f0ed-4358-8bcb711a792b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Service_NoReap - 2019/12/30 18:51:30.570071 [DEBUG] tlsutil: Update with version 1
TestAgent_Service_NoReap - 2019/12/30 18:51:30.572433 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fa892919-f1f5-f0ed-4358-8bcb711a792b Address:127.0.0.1:17590}]
2019/12/30 18:51:31 [INFO]  raft: Node at 127.0.0.1:17590 [Follower] entering Follower state (Leader: "")
TestAgent_Service_NoReap - 2019/12/30 18:51:31.315483 [INFO] serf: EventMemberJoin: Node fa892919-f1f5-f0ed-4358-8bcb711a792b.dc1 127.0.0.1
TestAgent_Service_NoReap - 2019/12/30 18:51:31.320349 [INFO] serf: EventMemberJoin: Node fa892919-f1f5-f0ed-4358-8bcb711a792b 127.0.0.1
TestAgent_Service_NoReap - 2019/12/30 18:51:31.321923 [INFO] consul: Adding LAN server Node fa892919-f1f5-f0ed-4358-8bcb711a792b (Addr: tcp/127.0.0.1:17590) (DC: dc1)
TestAgent_Service_NoReap - 2019/12/30 18:51:31.322068 [INFO] consul: Handled member-join event for server "Node fa892919-f1f5-f0ed-4358-8bcb711a792b.dc1" in area "wan"
TestAgent_Service_NoReap - 2019/12/30 18:51:31.323352 [INFO] agent: Started DNS server 127.0.0.1:17585 (tcp)
TestAgent_Service_NoReap - 2019/12/30 18:51:31.323823 [INFO] agent: Started DNS server 127.0.0.1:17585 (udp)
TestAgent_Service_NoReap - 2019/12/30 18:51:31.326103 [INFO] agent: Started HTTP server on 127.0.0.1:17586 (tcp)
TestAgent_Service_NoReap - 2019/12/30 18:51:31.326205 [INFO] agent: started state syncer
2019/12/30 18:51:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:31 [INFO]  raft: Node at 127.0.0.1:17590 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:31 [INFO]  raft: Node at 127.0.0.1:17590 [Leader] entering Leader state
TestAgent_Service_NoReap - 2019/12/30 18:51:31.826230 [INFO] consul: cluster leadership acquired
TestAgent_Service_NoReap - 2019/12/30 18:51:31.826699 [INFO] consul: New leader elected: Node fa892919-f1f5-f0ed-4358-8bcb711a792b
TestAgent_Service_NoReap - 2019/12/30 18:51:31.855166 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_NoReap - 2019/12/30 18:51:32.138598 [INFO] agent: Synced service "redis"
TestAgent_Service_NoReap - 2019/12/30 18:51:32.138698 [DEBUG] agent: Check "service:redis" in sync
TestAgent_Service_NoReap - 2019/12/30 18:51:32.138736 [DEBUG] agent: Node info in sync
TestAgent_Service_NoReap - 2019/12/30 18:51:32.339234 [INFO] agent: Requesting shutdown
TestAgent_Service_NoReap - 2019/12/30 18:51:32.339352 [INFO] consul: shutting down server
TestAgent_Service_NoReap - 2019/12/30 18:51:32.339457 [WARN] serf: Shutdown without a Leave
TestAgent_Service_NoReap - 2019/12/30 18:51:32.453226 [WARN] serf: Shutdown without a Leave
TestAgent_Service_NoReap - 2019/12/30 18:51:32.594968 [INFO] manager: shutting down
TestAgent_Service_NoReap - 2019/12/30 18:51:32.620947 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_Service_NoReap - 2019/12/30 18:51:32.662032 [INFO] agent: consul server down
TestAgent_Service_NoReap - 2019/12/30 18:51:32.662130 [INFO] agent: shutdown complete
TestAgent_Service_NoReap - 2019/12/30 18:51:32.662207 [INFO] agent: Stopping DNS server 127.0.0.1:17585 (tcp)
TestAgent_Service_NoReap - 2019/12/30 18:51:32.662381 [INFO] agent: Stopping DNS server 127.0.0.1:17585 (udp)
TestAgent_Service_NoReap - 2019/12/30 18:51:32.662569 [INFO] agent: Stopping HTTP server 127.0.0.1:17586 (tcp)
TestAgent_Service_NoReap - 2019/12/30 18:51:32.662844 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Service_NoReap - 2019/12/30 18:51:32.662934 [INFO] agent: Endpoints down
--- PASS: TestAgent_Service_NoReap (2.16s)
=== RUN   TestAgent_AddService_restoresSnapshot
=== PAUSE TestAgent_AddService_restoresSnapshot
=== RUN   TestAgent_AddCheck_restoresSnapshot
=== PAUSE TestAgent_AddCheck_restoresSnapshot
=== RUN   TestAgent_NodeMaintenanceMode
=== PAUSE TestAgent_NodeMaintenanceMode
=== RUN   TestAgent_checkStateSnapshot
=== PAUSE TestAgent_checkStateSnapshot
=== RUN   TestAgent_loadChecks_checkFails
=== PAUSE TestAgent_loadChecks_checkFails
=== RUN   TestAgent_persistCheckState
=== PAUSE TestAgent_persistCheckState
=== RUN   TestAgent_loadCheckState
=== PAUSE TestAgent_loadCheckState
=== RUN   TestAgent_purgeCheckState
=== PAUSE TestAgent_purgeCheckState
=== RUN   TestAgent_GetCoordinate
=== PAUSE TestAgent_GetCoordinate
=== RUN   TestAgent_reloadWatches
=== PAUSE TestAgent_reloadWatches
TestAgent_Service_NoReap - 2019/12/30 18:51:32.664608 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_Service_NoReap - 2019/12/30 18:51:32.664830 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_Service_NoReap - 2019/12/30 18:51:32.664914 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
=== RUN   TestAgent_reloadWatchesHTTPS
=== PAUSE TestAgent_reloadWatchesHTTPS
=== RUN   TestAgent_AddProxy
--- SKIP: TestAgent_AddProxy (0.00s)
    agent_test.go:3265: DM-skipped
=== RUN   TestAgent_RemoveProxy
=== PAUSE TestAgent_RemoveProxy
=== RUN   TestAgent_ReLoadProxiesFromConfig
=== PAUSE TestAgent_ReLoadProxiesFromConfig
=== RUN   TestAgent_SetupProxyManager
=== PAUSE TestAgent_SetupProxyManager
=== RUN   TestAgent_loadTokens
=== PAUSE TestAgent_loadTokens
=== RUN   TestAgent_ReloadConfigOutgoingRPCConfig
=== PAUSE TestAgent_ReloadConfigOutgoingRPCConfig
=== RUN   TestAgent_ReloadConfigIncomingRPCConfig
=== PAUSE TestAgent_ReloadConfigIncomingRPCConfig
=== RUN   TestAgent_ReloadConfigTLSConfigFailure
=== PAUSE TestAgent_ReloadConfigTLSConfigFailure
=== RUN   TestAgent_consulConfig
=== PAUSE TestAgent_consulConfig
=== RUN   TestBlacklist
=== PAUSE TestBlacklist
=== RUN   TestCatalogRegister_Service_InvalidAddress
=== PAUSE TestCatalogRegister_Service_InvalidAddress
=== RUN   TestCatalogDeregister
=== PAUSE TestCatalogDeregister
=== RUN   TestCatalogDatacenters
=== PAUSE TestCatalogDatacenters
=== RUN   TestCatalogNodes
=== PAUSE TestCatalogNodes
=== RUN   TestCatalogNodes_MetaFilter
=== PAUSE TestCatalogNodes_MetaFilter
=== RUN   TestCatalogNodes_Filter
=== PAUSE TestCatalogNodes_Filter
=== RUN   TestCatalogNodes_WanTranslation
--- SKIP: TestCatalogNodes_WanTranslation (0.00s)
    catalog_endpoint_test.go:194: DM-skipped
=== RUN   TestCatalogNodes_Blocking
=== PAUSE TestCatalogNodes_Blocking
=== RUN   TestCatalogNodes_DistanceSort
=== PAUSE TestCatalogNodes_DistanceSort
=== RUN   TestCatalogServices
=== PAUSE TestCatalogServices
=== RUN   TestCatalogServices_NodeMetaFilter
=== PAUSE TestCatalogServices_NodeMetaFilter
=== RUN   TestCatalogServiceNodes
=== PAUSE TestCatalogServiceNodes
=== RUN   TestCatalogServiceNodes_NodeMetaFilter
=== PAUSE TestCatalogServiceNodes_NodeMetaFilter
=== RUN   TestCatalogServiceNodes_Filter
=== PAUSE TestCatalogServiceNodes_Filter
=== RUN   TestCatalogServiceNodes_WanTranslation
--- SKIP: TestCatalogServiceNodes_WanTranslation (0.00s)
    catalog_endpoint_test.go:756: DM-skipped
=== RUN   TestCatalogServiceNodes_DistanceSort
=== PAUSE TestCatalogServiceNodes_DistanceSort
=== RUN   TestCatalogServiceNodes_ConnectProxy
=== PAUSE TestCatalogServiceNodes_ConnectProxy
=== RUN   TestCatalogConnectServiceNodes_good
=== PAUSE TestCatalogConnectServiceNodes_good
=== RUN   TestCatalogConnectServiceNodes_Filter
=== PAUSE TestCatalogConnectServiceNodes_Filter
=== RUN   TestCatalogNodeServices
=== PAUSE TestCatalogNodeServices
=== RUN   TestCatalogNodeServices_Filter
=== PAUSE TestCatalogNodeServices_Filter
=== RUN   TestCatalogNodeServices_ConnectProxy
=== PAUSE TestCatalogNodeServices_ConnectProxy
=== RUN   TestCatalogNodeServices_WanTranslation
--- SKIP: TestCatalogNodeServices_WanTranslation (0.00s)
    catalog_endpoint_test.go:1136: DM-skipped
=== RUN   TestConfig_Get
=== PAUSE TestConfig_Get
=== RUN   TestConfig_Delete
=== PAUSE TestConfig_Delete
=== RUN   TestConfig_Apply
=== PAUSE TestConfig_Apply
=== RUN   TestConfig_Apply_CAS
=== PAUSE TestConfig_Apply_CAS
=== RUN   TestConfig_Apply_Decoding
=== PAUSE TestConfig_Apply_Decoding
=== RUN   TestConnectCARoots_empty
=== PAUSE TestConnectCARoots_empty
=== RUN   TestConnectCARoots_list
=== PAUSE TestConnectCARoots_list
=== RUN   TestConnectCAConfig
=== PAUSE TestConnectCAConfig
=== RUN   TestCoordinate_Disabled_Response
=== PAUSE TestCoordinate_Disabled_Response
=== RUN   TestCoordinate_Datacenters
--- SKIP: TestCoordinate_Datacenters (0.00s)
    coordinate_endpoint_test.go:54: DM-skipped
=== RUN   TestCoordinate_Nodes
=== PAUSE TestCoordinate_Nodes
=== RUN   TestCoordinate_Node
=== PAUSE TestCoordinate_Node
=== RUN   TestCoordinate_Update
=== PAUSE TestCoordinate_Update
=== RUN   TestCoordinate_Update_ACLDeny
=== PAUSE TestCoordinate_Update_ACLDeny
=== RUN   TestRecursorAddr
=== PAUSE TestRecursorAddr
=== RUN   TestEncodeKVasRFC1464
--- PASS: TestEncodeKVasRFC1464 (0.00s)
=== RUN   TestDNS_Over_TCP
=== PAUSE TestDNS_Over_TCP
=== RUN   TestDNS_NodeLookup
--- SKIP: TestDNS_NodeLookup (0.00s)
    dns_test.go:178: DM-skipped
=== RUN   TestDNS_CaseInsensitiveNodeLookup
=== PAUSE TestDNS_CaseInsensitiveNodeLookup
=== RUN   TestDNS_NodeLookup_PeriodName
=== PAUSE TestDNS_NodeLookup_PeriodName
=== RUN   TestDNS_NodeLookup_AAAA
=== PAUSE TestDNS_NodeLookup_AAAA
=== RUN   TestDNSCycleRecursorCheck
=== PAUSE TestDNSCycleRecursorCheck
=== RUN   TestDNSCycleRecursorCheckAllFail
--- SKIP: TestDNSCycleRecursorCheckAllFail (0.00s)
    dns_test.go:423: DM-skipped
=== RUN   TestDNS_NodeLookup_CNAME
=== PAUSE TestDNS_NodeLookup_CNAME
=== RUN   TestDNS_NodeLookup_TXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:32.823858 [WARN] agent: Node name "Node 030c9153-3029-aa30-aa98-da4955c57b57" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:32.824576 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:32.837011 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:030c9153-3029-aa30-aa98-da4955c57b57 Address:127.0.0.1:17596}]
2019/12/30 18:51:34 [INFO]  raft: Node at 127.0.0.1:17596 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.192251 [INFO] serf: EventMemberJoin: Node 030c9153-3029-aa30-aa98-da4955c57b57.dc1 127.0.0.1
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.196173 [INFO] serf: EventMemberJoin: Node 030c9153-3029-aa30-aa98-da4955c57b57 127.0.0.1
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.197225 [INFO] consul: Adding LAN server Node 030c9153-3029-aa30-aa98-da4955c57b57 (Addr: tcp/127.0.0.1:17596) (DC: dc1)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.197666 [INFO] consul: Handled member-join event for server "Node 030c9153-3029-aa30-aa98-da4955c57b57.dc1" in area "wan"
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.198903 [INFO] agent: Started DNS server 127.0.0.1:17591 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.198982 [INFO] agent: Started DNS server 127.0.0.1:17591 (udp)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.201445 [INFO] agent: Started HTTP server on 127.0.0.1:17592 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.201544 [INFO] agent: started state syncer
2019/12/30 18:51:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:34 [INFO]  raft: Node at 127.0.0.1:17596 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:34 [INFO]  raft: Node at 127.0.0.1:17596 [Leader] entering Leader state
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.837074 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:34.837538 [INFO] consul: New leader elected: Node 030c9153-3029-aa30-aa98-da4955c57b57
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:35.187767 [INFO] agent: Synced node info
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.333132 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.333266 [INFO] consul: shutting down server
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.333329 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.333350 [DEBUG] dns: request for name google.node.consul. type TXT class IN (took 1.022027ms) from client 127.0.0.1:53839 (udp)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.403340 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.478419 [INFO] manager: shutting down
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.554165 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.554815 [INFO] agent: consul server down
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.554882 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.554938 [INFO] agent: Stopping DNS server 127.0.0.1:17591 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.555084 [INFO] agent: Stopping DNS server 127.0.0.1:17591 (udp)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.555251 [INFO] agent: Stopping HTTP server 127.0.0.1:17592 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.555461 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TXT - 2019/12/30 18:51:36.555536 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TXT (3.88s)
=== RUN   TestDNS_NodeLookup_TXT_DontSuppress
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:36.618540 [WARN] agent: Node name "Node 50acf576-8bd2-8eae-7048-ffe87caddfbe" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:36.619177 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:36.621765 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:50acf576-8bd2-8eae-7048-ffe87caddfbe Address:127.0.0.1:17602}]
2019/12/30 18:51:37 [INFO]  raft: Node at 127.0.0.1:17602 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.391783 [INFO] serf: EventMemberJoin: Node 50acf576-8bd2-8eae-7048-ffe87caddfbe.dc1 127.0.0.1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.399004 [INFO] serf: EventMemberJoin: Node 50acf576-8bd2-8eae-7048-ffe87caddfbe 127.0.0.1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.400550 [INFO] consul: Adding LAN server Node 50acf576-8bd2-8eae-7048-ffe87caddfbe (Addr: tcp/127.0.0.1:17602) (DC: dc1)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.401117 [INFO] consul: Handled member-join event for server "Node 50acf576-8bd2-8eae-7048-ffe87caddfbe.dc1" in area "wan"
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.402437 [INFO] agent: Started DNS server 127.0.0.1:17597 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.402948 [INFO] agent: Started DNS server 127.0.0.1:17597 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.405287 [INFO] agent: Started HTTP server on 127.0.0.1:17598 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.405382 [INFO] agent: started state syncer
2019/12/30 18:51:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:37 [INFO]  raft: Node at 127.0.0.1:17602 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:37 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:37 [INFO]  raft: Node at 127.0.0.1:17602 [Leader] entering Leader state
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.920378 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:37.920801 [INFO] consul: New leader elected: Node 50acf576-8bd2-8eae-7048-ffe87caddfbe
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:38.287484 [INFO] agent: Synced node info
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:38.872095 [DEBUG] dns: request for name google.node.consul. type TXT class IN (took 602.016µs) from client 127.0.0.1:40158 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:38.872193 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:38.872276 [INFO] consul: shutting down server
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:38.872335 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:38.953437 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.028425 [INFO] manager: shutting down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.029139 [INFO] agent: consul server down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.029192 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.029246 [INFO] agent: Stopping DNS server 127.0.0.1:17597 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.029451 [INFO] agent: Stopping DNS server 127.0.0.1:17597 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.029611 [INFO] agent: Stopping HTTP server 127.0.0.1:17598 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.029836 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.029917 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TXT_DontSuppress (2.47s)
=== RUN   TestDNS_NodeLookup_ANY
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.044317 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/30 18:51:39.044962 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.135571 [WARN] agent: Node name "Node 62e518fa-5d45-601b-5b49-e8d161287ba5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.147401 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.153262 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:62e518fa-5d45-601b-5b49-e8d161287ba5 Address:127.0.0.1:17608}]
2019/12/30 18:51:39 [INFO]  raft: Node at 127.0.0.1:17608 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.883380 [INFO] serf: EventMemberJoin: Node 62e518fa-5d45-601b-5b49-e8d161287ba5.dc1 127.0.0.1
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.887438 [INFO] serf: EventMemberJoin: Node 62e518fa-5d45-601b-5b49-e8d161287ba5 127.0.0.1
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.888313 [INFO] consul: Adding LAN server Node 62e518fa-5d45-601b-5b49-e8d161287ba5 (Addr: tcp/127.0.0.1:17608) (DC: dc1)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.889075 [INFO] consul: Handled member-join event for server "Node 62e518fa-5d45-601b-5b49-e8d161287ba5.dc1" in area "wan"
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.890420 [INFO] agent: Started DNS server 127.0.0.1:17603 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.890964 [INFO] agent: Started DNS server 127.0.0.1:17603 (udp)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.893241 [INFO] agent: Started HTTP server on 127.0.0.1:17604 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:39.893314 [INFO] agent: started state syncer
2019/12/30 18:51:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:39 [INFO]  raft: Node at 127.0.0.1:17608 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:40 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:40 [INFO]  raft: Node at 127.0.0.1:17608 [Leader] entering Leader state
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:40.378847 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:40.379280 [INFO] consul: New leader elected: Node 62e518fa-5d45-601b-5b49-e8d161287ba5
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:40.695958 [INFO] agent: Synced node info
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.023332 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 747.02µs) from client 127.0.0.1:47541 (udp)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.023535 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.023633 [INFO] consul: shutting down server
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.023688 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.141552 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.141668 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.320062 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.636864 [INFO] manager: shutting down
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.637277 [INFO] agent: consul server down
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.637332 [INFO] agent: shutdown complete
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.637389 [INFO] agent: Stopping DNS server 127.0.0.1:17603 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.637544 [INFO] agent: Stopping DNS server 127.0.0.1:17603 (udp)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.637727 [INFO] agent: Stopping HTTP server 127.0.0.1:17604 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.637987 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_ANY - 2019/12/30 18:51:41.638067 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_ANY (2.61s)
=== RUN   TestDNS_NodeLookup_ANY_DontSuppressTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:41.696010 [WARN] agent: Node name "Node c2e44bf1-f8e1-b8aa-81de-9cc7e3a69dd3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:41.696402 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:41.698478 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c2e44bf1-f8e1-b8aa-81de-9cc7e3a69dd3 Address:127.0.0.1:17614}]
2019/12/30 18:51:42 [INFO]  raft: Node at 127.0.0.1:17614 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.885193 [INFO] serf: EventMemberJoin: Node c2e44bf1-f8e1-b8aa-81de-9cc7e3a69dd3.dc1 127.0.0.1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.889343 [INFO] serf: EventMemberJoin: Node c2e44bf1-f8e1-b8aa-81de-9cc7e3a69dd3 127.0.0.1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.891079 [INFO] consul: Adding LAN server Node c2e44bf1-f8e1-b8aa-81de-9cc7e3a69dd3 (Addr: tcp/127.0.0.1:17614) (DC: dc1)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.891542 [INFO] consul: Handled member-join event for server "Node c2e44bf1-f8e1-b8aa-81de-9cc7e3a69dd3.dc1" in area "wan"
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.892846 [INFO] agent: Started DNS server 127.0.0.1:17609 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.893353 [INFO] agent: Started DNS server 127.0.0.1:17609 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.895950 [INFO] agent: Started HTTP server on 127.0.0.1:17610 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:42.896051 [INFO] agent: started state syncer
2019/12/30 18:51:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:42 [INFO]  raft: Node at 127.0.0.1:17614 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:43 [INFO]  raft: Node at 127.0.0.1:17614 [Leader] entering Leader state
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:43.662245 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:43.662720 [INFO] consul: New leader elected: Node c2e44bf1-f8e1-b8aa-81de-9cc7e3a69dd3
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:44.004466 [INFO] agent: Synced node info
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:45.204765 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 852.023µs) from client 127.0.0.1:40817 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:45.205108 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:45.205263 [INFO] consul: shutting down server
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:45.205367 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.333501 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.333615 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.429294 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.503564 [INFO] manager: shutting down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.504294 [INFO] agent: consul server down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.504359 [INFO] agent: shutdown complete
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.504491 [INFO] agent: Stopping DNS server 127.0.0.1:17609 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.504650 [INFO] agent: Stopping DNS server 127.0.0.1:17609 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.504812 [INFO] agent: Stopping HTTP server 127.0.0.1:17610 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.505023 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.505101 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_ANY_DontSuppressTXT (4.87s)
=== RUN   TestDNS_NodeLookup_A_SuppressTXT
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/30 18:51:46.543921 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:46.624800 [WARN] agent: Node name "Node 7f3d86f6-95d6-34dc-545c-2bdb2721d2bd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:46.625478 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:46.628035 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7f3d86f6-95d6-34dc-545c-2bdb2721d2bd Address:127.0.0.1:17620}]
2019/12/30 18:51:47 [INFO]  raft: Node at 127.0.0.1:17620 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.493127 [INFO] serf: EventMemberJoin: Node 7f3d86f6-95d6-34dc-545c-2bdb2721d2bd.dc1 127.0.0.1
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.512770 [INFO] serf: EventMemberJoin: Node 7f3d86f6-95d6-34dc-545c-2bdb2721d2bd 127.0.0.1
2019/12/30 18:51:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:47 [INFO]  raft: Node at 127.0.0.1:17620 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.514583 [INFO] agent: Started DNS server 127.0.0.1:17615 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.515452 [INFO] consul: Adding LAN server Node 7f3d86f6-95d6-34dc-545c-2bdb2721d2bd (Addr: tcp/127.0.0.1:17620) (DC: dc1)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.515795 [INFO] consul: Handled member-join event for server "Node 7f3d86f6-95d6-34dc-545c-2bdb2721d2bd.dc1" in area "wan"
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.516546 [INFO] agent: Started DNS server 127.0.0.1:17615 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.521356 [INFO] agent: Started HTTP server on 127.0.0.1:17616 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.521473 [INFO] agent: started state syncer
2019/12/30 18:51:47 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:47 [INFO]  raft: Node at 127.0.0.1:17620 [Leader] entering Leader state
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.954640 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:47.955209 [INFO] consul: New leader elected: Node 7f3d86f6-95d6-34dc-545c-2bdb2721d2bd
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.296075 [INFO] agent: Synced node info
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.296195 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.633673 [DEBUG] dns: request for name bar.node.consul. type A class IN (took 467.679µs) from client 127.0.0.1:38338 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.633951 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.634058 [INFO] consul: shutting down server
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.634115 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.761898 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.837000 [INFO] manager: shutting down
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.878822 [INFO] agent: consul server down
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.878893 [INFO] agent: shutdown complete
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.878952 [INFO] agent: Stopping DNS server 127.0.0.1:17615 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.879086 [INFO] agent: Stopping DNS server 127.0.0.1:17615 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.879236 [INFO] agent: Stopping HTTP server 127.0.0.1:17616 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.879489 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.879567 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_A_SuppressTXT (2.37s)
=== RUN   TestDNS_EDNS0
=== PAUSE TestDNS_EDNS0
=== RUN   TestDNS_EDNS0_ECS
=== PAUSE TestDNS_EDNS0_ECS
=== RUN   TestDNS_ReverseLookup
=== PAUSE TestDNS_ReverseLookup
=== RUN   TestDNS_ReverseLookup_CustomDomain
=== PAUSE TestDNS_ReverseLookup_CustomDomain
=== RUN   TestDNS_ReverseLookup_IPV6
=== PAUSE TestDNS_ReverseLookup_IPV6
=== RUN   TestDNS_ServiceReverseLookup
--- SKIP: TestDNS_ServiceReverseLookup (0.00s)
    dns_test.go:976: DM-skipped
=== RUN   TestDNS_ServiceReverseLookup_IPV6
=== PAUSE TestDNS_ServiceReverseLookup_IPV6
=== RUN   TestDNS_ServiceReverseLookup_CustomDomain
=== PAUSE TestDNS_ServiceReverseLookup_CustomDomain
=== RUN   TestDNS_SOA_Settings
=== PAUSE TestDNS_SOA_Settings
=== RUN   TestDNS_ServiceReverseLookupNodeAddress
=== PAUSE TestDNS_ServiceReverseLookupNodeAddress
=== RUN   TestDNS_ServiceLookupNoMultiCNAME
--- SKIP: TestDNS_ServiceLookupNoMultiCNAME (0.00s)
    dns_test.go:1204: DM-skipped
=== RUN   TestDNS_ServiceLookupPreferNoCNAME
=== PAUSE TestDNS_ServiceLookupPreferNoCNAME
=== RUN   TestDNS_ServiceLookupMultiAddrNoCNAME
=== PAUSE TestDNS_ServiceLookupMultiAddrNoCNAME
=== RUN   TestDNS_ServiceLookup
=== PAUSE TestDNS_ServiceLookup
=== RUN   TestDNS_ServiceLookupWithInternalServiceAddress
=== PAUSE TestDNS_ServiceLookupWithInternalServiceAddress
=== RUN   TestDNS_ConnectServiceLookup
=== PAUSE TestDNS_ConnectServiceLookup
=== RUN   TestDNS_ExternalServiceLookup
=== PAUSE TestDNS_ExternalServiceLookup
=== RUN   TestDNS_InifiniteRecursion
=== PAUSE TestDNS_InifiniteRecursion
=== RUN   TestDNS_ExternalServiceToConsulCNAMELookup
=== PAUSE TestDNS_ExternalServiceToConsulCNAMELookup
=== RUN   TestDNS_NSRecords
--- SKIP: TestDNS_NSRecords (0.00s)
    dns_test.go:1860: DM-skipped
=== RUN   TestDNS_NSRecords_IPV6
=== PAUSE TestDNS_NSRecords_IPV6
=== RUN   TestDNS_ExternalServiceToConsulCNAMENestedLookup
=== PAUSE TestDNS_ExternalServiceToConsulCNAMENestedLookup
=== RUN   TestDNS_ServiceLookup_ServiceAddress_A
=== PAUSE TestDNS_ServiceLookup_ServiceAddress_A
=== RUN   TestDNS_ServiceLookup_ServiceAddress_CNAME
=== PAUSE TestDNS_ServiceLookup_ServiceAddress_CNAME
=== RUN   TestDNS_ServiceLookup_ServiceAddressIPV6
=== PAUSE TestDNS_ServiceLookup_ServiceAddressIPV6
=== RUN   TestDNS_ServiceLookup_WanAddress
--- SKIP: TestDNS_ServiceLookup_WanAddress (0.00s)
    dns_test.go:2354: DM-skipped
=== RUN   TestDNS_CaseInsensitiveServiceLookup
=== PAUSE TestDNS_CaseInsensitiveServiceLookup
=== RUN   TestDNS_ServiceLookup_TagPeriod
=== PAUSE TestDNS_ServiceLookup_TagPeriod
=== RUN   TestDNS_PreparedQueryNearIPEDNS
=== PAUSE TestDNS_PreparedQueryNearIPEDNS
=== RUN   TestDNS_PreparedQueryNearIP
=== PAUSE TestDNS_PreparedQueryNearIP
=== RUN   TestDNS_ServiceLookup_PreparedQueryNamePeriod
=== PAUSE TestDNS_ServiceLookup_PreparedQueryNamePeriod
=== RUN   TestDNS_ServiceLookup_Dedup
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/30 18:51:48.884529 [ERR] consul: failed to establish leadership: leadership lost while committing log
--- SKIP: TestDNS_ServiceLookup_Dedup (0.00s)
    dns_test.go:3008: DM-skipped
=== RUN   TestDNS_ServiceLookup_Dedup_SRV
=== PAUSE TestDNS_ServiceLookup_Dedup_SRV
=== RUN   TestDNS_Recurse
=== PAUSE TestDNS_Recurse
=== RUN   TestDNS_Recurse_Truncation
=== PAUSE TestDNS_Recurse_Truncation
=== RUN   TestDNS_RecursorTimeout
=== PAUSE TestDNS_RecursorTimeout
=== RUN   TestDNS_ServiceLookup_FilterCritical
=== PAUSE TestDNS_ServiceLookup_FilterCritical
=== RUN   TestDNS_ServiceLookup_OnlyFailing
=== PAUSE TestDNS_ServiceLookup_OnlyFailing
=== RUN   TestDNS_ServiceLookup_OnlyPassing
=== PAUSE TestDNS_ServiceLookup_OnlyPassing
=== RUN   TestDNS_ServiceLookup_Randomize
=== PAUSE TestDNS_ServiceLookup_Randomize
=== RUN   TestBinarySearch
=== PAUSE TestBinarySearch
=== RUN   TestDNS_TCP_and_UDP_Truncate
--- SKIP: TestDNS_TCP_and_UDP_Truncate (0.00s)
    dns_test.go:3903: DM-skipped
=== RUN   TestDNS_ServiceLookup_Truncate
=== PAUSE TestDNS_ServiceLookup_Truncate
=== RUN   TestDNS_ServiceLookup_LargeResponses
=== PAUSE TestDNS_ServiceLookup_LargeResponses
=== RUN   TestDNS_ServiceLookup_ARecordLimits
--- SKIP: TestDNS_ServiceLookup_ARecordLimits (0.00s)
    dns_test.go:4342: DM-skipped
=== RUN   TestDNS_ServiceLookup_AnswerLimits
=== PAUSE TestDNS_ServiceLookup_AnswerLimits
=== RUN   TestDNS_ServiceLookup_CNAME
--- SKIP: TestDNS_ServiceLookup_CNAME (0.00s)
    dns_test.go:4487: DM-skipped
=== RUN   TestDNS_NodeLookup_TTL
=== PAUSE TestDNS_NodeLookup_TTL
=== RUN   TestDNS_ServiceLookup_TTL
=== PAUSE TestDNS_ServiceLookup_TTL
=== RUN   TestDNS_PreparedQuery_TTL
=== PAUSE TestDNS_PreparedQuery_TTL
=== RUN   TestDNS_PreparedQuery_Failover
--- SKIP: TestDNS_PreparedQuery_Failover (0.00s)
    dns_test.go:4909: DM-skipped
=== RUN   TestDNS_ServiceLookup_SRV_RFC
=== PAUSE TestDNS_ServiceLookup_SRV_RFC
=== RUN   TestDNS_ServiceLookup_SRV_RFC_TCP_Default
=== PAUSE TestDNS_ServiceLookup_SRV_RFC_TCP_Default
=== RUN   TestDNS_ServiceLookup_FilterACL
=== PAUSE TestDNS_ServiceLookup_FilterACL
=== RUN   TestDNS_ServiceLookup_MetaTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:49.017050 [WARN] agent: Node name "Node 0b5622cc-4ed8-48ff-b7f8-8ca9216f60f6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:49.020590 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:49.027828 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0b5622cc-4ed8-48ff-b7f8-8ca9216f60f6 Address:127.0.0.1:17626}]
2019/12/30 18:51:50 [INFO]  raft: Node at 127.0.0.1:17626 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.383529 [INFO] serf: EventMemberJoin: Node 0b5622cc-4ed8-48ff-b7f8-8ca9216f60f6.dc1 127.0.0.1
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.387258 [INFO] serf: EventMemberJoin: Node 0b5622cc-4ed8-48ff-b7f8-8ca9216f60f6 127.0.0.1
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.388276 [INFO] consul: Adding LAN server Node 0b5622cc-4ed8-48ff-b7f8-8ca9216f60f6 (Addr: tcp/127.0.0.1:17626) (DC: dc1)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.388472 [INFO] consul: Handled member-join event for server "Node 0b5622cc-4ed8-48ff-b7f8-8ca9216f60f6.dc1" in area "wan"
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.389359 [INFO] agent: Started DNS server 127.0.0.1:17621 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.389703 [INFO] agent: Started DNS server 127.0.0.1:17621 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.392064 [INFO] agent: Started HTTP server on 127.0.0.1:17622 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.392195 [INFO] agent: started state syncer
2019/12/30 18:51:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:50 [INFO]  raft: Node at 127.0.0.1:17626 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:50 [INFO]  raft: Node at 127.0.0.1:17626 [Leader] entering Leader state
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.879140 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:50.879657 [INFO] consul: New leader elected: Node 0b5622cc-4ed8-48ff-b7f8-8ca9216f60f6
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.296117 [INFO] agent: Synced node info
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.599079 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.521374ms) from client 127.0.0.1:44996 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.599283 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.599363 [INFO] consul: shutting down server
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.599477 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.728722 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.853759 [INFO] manager: shutting down
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.912035 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.912291 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.912366 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.912999 [INFO] agent: consul server down
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.913179 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.913317 [INFO] agent: Stopping DNS server 127.0.0.1:17621 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.914000 [INFO] agent: Stopping DNS server 127.0.0.1:17621 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.914197 [INFO] agent: Stopping HTTP server 127.0.0.1:17622 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.914504 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_MetaTXT - 2019/12/30 18:51:51.914592 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_MetaTXT (3.02s)
=== RUN   TestDNS_ServiceLookup_SuppressTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:51.981354 [WARN] agent: Node name "Node b3e16324-122e-18b2-466b-c899e995ecd7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:51.981974 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:51.984312 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b3e16324-122e-18b2-466b-c899e995ecd7 Address:127.0.0.1:17632}]
2019/12/30 18:51:52 [INFO]  raft: Node at 127.0.0.1:17632 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.744041 [INFO] serf: EventMemberJoin: Node b3e16324-122e-18b2-466b-c899e995ecd7.dc1 127.0.0.1
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.749319 [INFO] serf: EventMemberJoin: Node b3e16324-122e-18b2-466b-c899e995ecd7 127.0.0.1
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.750806 [INFO] consul: Adding LAN server Node b3e16324-122e-18b2-466b-c899e995ecd7 (Addr: tcp/127.0.0.1:17632) (DC: dc1)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.751274 [INFO] consul: Handled member-join event for server "Node b3e16324-122e-18b2-466b-c899e995ecd7.dc1" in area "wan"
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.753003 [INFO] agent: Started DNS server 127.0.0.1:17627 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.753526 [INFO] agent: Started DNS server 127.0.0.1:17627 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.756354 [INFO] agent: Started HTTP server on 127.0.0.1:17628 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:52.756460 [INFO] agent: started state syncer
2019/12/30 18:51:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:52 [INFO]  raft: Node at 127.0.0.1:17632 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:53 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:53 [INFO]  raft: Node at 127.0.0.1:17632 [Leader] entering Leader state
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.230128 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.230591 [INFO] consul: New leader elected: Node b3e16324-122e-18b2-466b-c899e995ecd7
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.512904 [INFO] agent: Synced node info
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.513025 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.792200 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 771.687µs) from client 127.0.0.1:32949 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.795401 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.795513 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.795574 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:53.912727 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.045472 [INFO] manager: shutting down
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.103873 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104115 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104190 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104254 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104307 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104138 [INFO] agent: consul server down
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104476 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104531 [INFO] agent: Stopping DNS server 127.0.0.1:17627 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104682 [INFO] agent: Stopping DNS server 127.0.0.1:17627 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.104839 [INFO] agent: Stopping HTTP server 127.0.0.1:17628 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.105044 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SuppressTXT - 2019/12/30 18:51:54.105119 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SuppressTXT (2.19s)
=== RUN   TestDNS_AddressLookup
=== PAUSE TestDNS_AddressLookup
=== RUN   TestDNS_AddressLookupIPV6
--- SKIP: TestDNS_AddressLookupIPV6 (0.00s)
    dns_test.go:5350: DM-skipped
=== RUN   TestDNS_NonExistingLookup
=== PAUSE TestDNS_NonExistingLookup
=== RUN   TestDNS_NonExistingLookupEmptyAorAAAA
=== PAUSE TestDNS_NonExistingLookupEmptyAorAAAA
=== RUN   TestDNS_AltDomains_Service
=== PAUSE TestDNS_AltDomains_Service
=== RUN   TestDNS_AltDomains_SOA
=== PAUSE TestDNS_AltDomains_SOA
=== RUN   TestDNS_AltDomains_Overlap
=== PAUSE TestDNS_AltDomains_Overlap
=== RUN   TestDNS_PreparedQuery_AllowStale
=== PAUSE TestDNS_PreparedQuery_AllowStale
=== RUN   TestDNS_InvalidQueries
=== PAUSE TestDNS_InvalidQueries
=== RUN   TestDNS_PreparedQuery_AgentSource
=== PAUSE TestDNS_PreparedQuery_AgentSource
=== RUN   TestDNS_trimUDPResponse_NoTrim
=== PAUSE TestDNS_trimUDPResponse_NoTrim
=== RUN   TestDNS_trimUDPResponse_TrimLimit
=== PAUSE TestDNS_trimUDPResponse_TrimLimit
=== RUN   TestDNS_trimUDPResponse_TrimSize
=== PAUSE TestDNS_trimUDPResponse_TrimSize
=== RUN   TestDNS_trimUDPResponse_TrimSizeEDNS
=== PAUSE TestDNS_trimUDPResponse_TrimSizeEDNS
=== RUN   TestDNS_syncExtra
=== PAUSE TestDNS_syncExtra
=== RUN   TestDNS_Compression_trimUDPResponse
=== PAUSE TestDNS_Compression_trimUDPResponse
=== RUN   TestDNS_Compression_Query
=== PAUSE TestDNS_Compression_Query
=== RUN   TestDNS_Compression_ReverseLookup
=== PAUSE TestDNS_Compression_ReverseLookup
=== RUN   TestDNS_Compression_Recurse
=== PAUSE TestDNS_Compression_Recurse
=== RUN   TestDNSInvalidRegex
=== RUN   TestDNSInvalidRegex/Valid_Hostname
=== RUN   TestDNSInvalidRegex/Valid_Hostname#01
=== RUN   TestDNSInvalidRegex/Invalid_Hostname_with_special_chars
=== RUN   TestDNSInvalidRegex/Invalid_Hostname_with_special_chars_in_the_end
=== RUN   TestDNSInvalidRegex/Whitespace
=== RUN   TestDNSInvalidRegex/Only_special_chars
--- PASS: TestDNSInvalidRegex (0.00s)
    --- PASS: TestDNSInvalidRegex/Valid_Hostname (0.00s)
    --- PASS: TestDNSInvalidRegex/Valid_Hostname#01 (0.00s)
    --- PASS: TestDNSInvalidRegex/Invalid_Hostname_with_special_chars (0.00s)
    --- PASS: TestDNSInvalidRegex/Invalid_Hostname_with_special_chars_in_the_end (0.00s)
    --- PASS: TestDNSInvalidRegex/Whitespace (0.00s)
    --- PASS: TestDNSInvalidRegex/Only_special_chars (0.00s)
=== RUN   TestDNS_formatNodeRecord
--- PASS: TestDNS_formatNodeRecord (0.00s)
=== RUN   TestDNS_ConfigReload
=== PAUSE TestDNS_ConfigReload
=== RUN   TestDNS_ReloadConfig_DuringQuery
=== PAUSE TestDNS_ReloadConfig_DuringQuery
=== RUN   TestEventFire
=== PAUSE TestEventFire
=== RUN   TestEventFire_token
=== PAUSE TestEventFire_token
=== RUN   TestEventList
=== PAUSE TestEventList
=== RUN   TestEventList_Filter
=== PAUSE TestEventList_Filter
=== RUN   TestEventList_ACLFilter
=== PAUSE TestEventList_ACLFilter
=== RUN   TestEventList_Blocking
=== PAUSE TestEventList_Blocking
=== RUN   TestEventList_EventBufOrder
=== PAUSE TestEventList_EventBufOrder
=== RUN   TestUUIDToUint64
=== PAUSE TestUUIDToUint64
=== RUN   TestHealthChecksInState
--- SKIP: TestHealthChecksInState (0.00s)
    health_endpoint_test.go:23: DM-skipped
=== RUN   TestHealthChecksInState_NodeMetaFilter
=== PAUSE TestHealthChecksInState_NodeMetaFilter
=== RUN   TestHealthChecksInState_Filter
=== PAUSE TestHealthChecksInState_Filter
=== RUN   TestHealthChecksInState_DistanceSort
=== PAUSE TestHealthChecksInState_DistanceSort
=== RUN   TestHealthNodeChecks
=== PAUSE TestHealthNodeChecks
=== RUN   TestHealthNodeChecks_Filtering
=== PAUSE TestHealthNodeChecks_Filtering
=== RUN   TestHealthServiceChecks
=== PAUSE TestHealthServiceChecks
=== RUN   TestHealthServiceChecks_NodeMetaFilter
=== PAUSE TestHealthServiceChecks_NodeMetaFilter
=== RUN   TestHealthServiceChecks_Filtering
=== PAUSE TestHealthServiceChecks_Filtering
=== RUN   TestHealthServiceChecks_DistanceSort
=== PAUSE TestHealthServiceChecks_DistanceSort
=== RUN   TestHealthServiceNodes
=== PAUSE TestHealthServiceNodes
=== RUN   TestHealthServiceNodes_NodeMetaFilter
=== PAUSE TestHealthServiceNodes_NodeMetaFilter
=== RUN   TestHealthServiceNodes_Filter
--- SKIP: TestHealthServiceNodes_Filter (0.00s)
    health_endpoint_test.go:737: DM-skipped
=== RUN   TestHealthServiceNodes_DistanceSort
=== PAUSE TestHealthServiceNodes_DistanceSort
=== RUN   TestHealthServiceNodes_PassingFilter
--- SKIP: TestHealthServiceNodes_PassingFilter (0.00s)
    health_endpoint_test.go:879: DM-skipped
=== RUN   TestHealthServiceNodes_WanTranslation
=== PAUSE TestHealthServiceNodes_WanTranslation
=== RUN   TestHealthConnectServiceNodes
=== PAUSE TestHealthConnectServiceNodes
=== RUN   TestHealthConnectServiceNodes_Filter
=== PAUSE TestHealthConnectServiceNodes_Filter
=== RUN   TestHealthConnectServiceNodes_PassingFilter
=== PAUSE TestHealthConnectServiceNodes_PassingFilter
=== RUN   TestFilterNonPassing
=== PAUSE TestFilterNonPassing
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:54.248113 [WARN] agent: Node name "Node a465168d-d9e1-e400-3047-69dddd922f5b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:54.248667 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:54.251713 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a465168d-d9e1-e400-3047-69dddd922f5b Address:127.0.0.1:17638}]
2019/12/30 18:51:55 [INFO]  raft: Node at 127.0.0.1:17638 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.677310 [INFO] serf: EventMemberJoin: Node a465168d-d9e1-e400-3047-69dddd922f5b.dc1 127.0.0.1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.683892 [INFO] serf: EventMemberJoin: Node a465168d-d9e1-e400-3047-69dddd922f5b 127.0.0.1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.685946 [INFO] consul: Adding LAN server Node a465168d-d9e1-e400-3047-69dddd922f5b (Addr: tcp/127.0.0.1:17638) (DC: dc1)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.686903 [INFO] consul: Handled member-join event for server "Node a465168d-d9e1-e400-3047-69dddd922f5b.dc1" in area "wan"
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.689336 [INFO] agent: Started DNS server 127.0.0.1:17633 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.690001 [INFO] agent: Started DNS server 127.0.0.1:17633 (udp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.692498 [INFO] agent: Started HTTP server on 127.0.0.1:17634 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:55.692606 [INFO] agent: started state syncer
2019/12/30 18:51:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:55 [INFO]  raft: Node at 127.0.0.1:17638 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:56 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:56 [INFO]  raft: Node at 127.0.0.1:17638 [Leader] entering Leader state
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:56.757533 [INFO] consul: cluster leadership acquired
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:56.758028 [INFO] consul: New leader elected: Node a465168d-d9e1-e400-3047-69dddd922f5b
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:56.945248 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:57.239585 [INFO] acl: initializing acls
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:57.248740 [INFO] acl: initializing acls
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:57.831385 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:57.831488 [WARN] consul: Configuring a non-UUID master token is deprecated
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:57.832467 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:57.832540 [WARN] consul: Configuring a non-UUID master token is deprecated
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.208166 [INFO] consul: Bootstrapped ACL master token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.454875 [INFO] consul: Bootstrapped ACL master token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.613468 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.613601 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.615625 [INFO] serf: EventMemberUpdate: Node a465168d-d9e1-e400-3047-69dddd922f5b
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.616756 [INFO] serf: EventMemberUpdate: Node a465168d-d9e1-e400-3047-69dddd922f5b.dc1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.869639 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.870607 [INFO] serf: EventMemberUpdate: Node a465168d-d9e1-e400-3047-69dddd922f5b
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:51:58.871292 [INFO] serf: EventMemberUpdate: Node a465168d-d9e1-e400-3047-69dddd922f5b.dc1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:00.913223 [INFO] agent: Synced node info
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:00.913359 [DEBUG] agent: Node info in sync
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:01.270883 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:01.638786 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:01.639838 [DEBUG] consul: Skipping self join check for "Node a465168d-d9e1-e400-3047-69dddd922f5b" since the cluster is too small
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:01.640025 [INFO] consul: member 'Node a465168d-d9e1-e400-3047-69dddd922f5b' joined, marking health alive
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.022611 [DEBUG] consul: Skipping self join check for "Node a465168d-d9e1-e400-3047-69dddd922f5b" since the cluster is too small
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.023093 [DEBUG] consul: Skipping self join check for "Node a465168d-d9e1-e400-3047-69dddd922f5b" since the cluster is too small
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.041725 [ERR] http: Request GET /v1/query/, error: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:52350
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.042916 [DEBUG] http: Request GET /v1/query/ (1.61971ms) from=127.0.0.1:52350
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.049214 [ERR] http: Request PUT /v1/query/, error: Prepared Query lookup failed: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:52352
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.050509 [DEBUG] http: Request PUT /v1/query/ (1.873383ms) from=127.0.0.1:52352
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.053662 [ERR] http: Request POST /v1/query/, error: method POST not allowed from=127.0.0.1:52354
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.054284 [DEBUG] http: Request POST /v1/query/ (633.35µs) from=127.0.0.1:52354
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.058400 [ERR] http: Request DELETE /v1/query/, error: Prepared Query lookup failed: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:52356
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.059794 [DEBUG] http: Request DELETE /v1/query/ (2.070721ms) from=127.0.0.1:52356
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.063323 [ERR] http: Request HEAD /v1/query/, error: method HEAD not allowed from=127.0.0.1:52358
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.063469 [DEBUG] http: Request HEAD /v1/query/ (172.004µs) from=127.0.0.1:52358
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.070138 [DEBUG] http: Request OPTIONS /v1/query/ (2.88141ms) from=127.0.0.1:52358
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.073820 [DEBUG] http: Request GET /v1/query/xxx/execute (829.022µs) from=127.0.0.1:52360
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.076431 [ERR] http: Request PUT /v1/query/xxx/execute, error: method PUT not allowed from=127.0.0.1:52362
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.076938 [DEBUG] http: Request PUT /v1/query/xxx/execute (512.681µs) from=127.0.0.1:52362
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.079733 [ERR] http: Request POST /v1/query/xxx/execute, error: method POST not allowed from=127.0.0.1:52364
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.080370 [DEBUG] http: Request POST /v1/query/xxx/execute (613.017µs) from=127.0.0.1:52364
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.085930 [ERR] http: Request DELETE /v1/query/xxx/execute, error: method DELETE not allowed from=127.0.0.1:52366
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.087403 [DEBUG] http: Request DELETE /v1/query/xxx/execute (3.577762ms) from=127.0.0.1:52366
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.117220 [ERR] http: Request HEAD /v1/query/xxx/execute, error: method HEAD not allowed from=127.0.0.1:52368
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.117512 [DEBUG] http: Request HEAD /v1/query/xxx/execute (366.343µs) from=127.0.0.1:52368
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.120774 [DEBUG] http: Request OPTIONS /v1/query/xxx/execute (1.361036ms) from=127.0.0.1:52368
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.125789 [DEBUG] http: Request GET /v1/query/xxx/explain (1.00136ms) from=127.0.0.1:52370
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.174900 [ERR] http: Request PUT /v1/query/xxx/explain, error: method PUT not allowed from=127.0.0.1:52372
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.176116 [DEBUG] http: Request PUT /v1/query/xxx/explain (1.301702ms) from=127.0.0.1:52372
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.187475 [ERR] http: Request POST /v1/query/xxx/explain, error: method POST not allowed from=127.0.0.1:52374
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.188567 [DEBUG] http: Request POST /v1/query/xxx/explain (1.099696ms) from=127.0.0.1:52374
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.192026 [ERR] http: Request DELETE /v1/query/xxx/explain, error: method DELETE not allowed from=127.0.0.1:52376
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.192754 [DEBUG] http: Request DELETE /v1/query/xxx/explain (732.686µs) from=127.0.0.1:52376
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.202113 [ERR] http: Request HEAD /v1/query/xxx/explain, error: method HEAD not allowed from=127.0.0.1:52378
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.202522 [DEBUG] http: Request HEAD /v1/query/xxx/explain (419.345µs) from=127.0.0.1:52378
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.209475 [DEBUG] http: Request OPTIONS /v1/query/xxx/explain (1.027694ms) from=127.0.0.1:52378
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.215352 [DEBUG] http: Request GET /v1/query (2.583068ms) from=127.0.0.1:52380
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.218511 [ERR] http: Request PUT /v1/query, error: method PUT not allowed from=127.0.0.1:52382
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.219167 [DEBUG] http: Request PUT /v1/query (659.684µs) from=127.0.0.1:52382
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.222867 [DEBUG] http: Request POST /v1/query (554.015µs) from=127.0.0.1:52384
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.228317 [ERR] http: Request DELETE /v1/query, error: method DELETE not allowed from=127.0.0.1:52386
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.228973 [DEBUG] http: Request DELETE /v1/query (640.684µs) from=127.0.0.1:52386
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.232207 [ERR] http: Request HEAD /v1/query, error: method HEAD not allowed from=127.0.0.1:52388
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.232357 [DEBUG] http: Request HEAD /v1/query (173.672µs) from=127.0.0.1:52388
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.233727 [DEBUG] http: Request OPTIONS /v1/query (13µs) from=127.0.0.1:52388
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.235326 [ERR] http: Request GET /v1/acl/auth-method/, error: Bad request: Missing auth method name from=127.0.0.1:52388
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.235851 [DEBUG] http: Request GET /v1/acl/auth-method/ (547.348µs) from=127.0.0.1:52388
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.238936 [ERR] http: Request PUT /v1/acl/auth-method/, error: Bad request: AuthMethod decoding failed: EOF from=127.0.0.1:52390
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.239646 [DEBUG] http: Request PUT /v1/acl/auth-method/ (744.353µs) from=127.0.0.1:52390
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.242467 [ERR] http: Request POST /v1/acl/auth-method/, error: method POST not allowed from=127.0.0.1:52392
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.243065 [DEBUG] http: Request POST /v1/acl/auth-method/ (598.683µs) from=127.0.0.1:52392
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.245900 [ERR] http: Request DELETE /v1/acl/auth-method/, error: Bad request: Missing auth method name from=127.0.0.1:52394
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.246477 [DEBUG] http: Request DELETE /v1/acl/auth-method/ (584.016µs) from=127.0.0.1:52394
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.249462 [ERR] http: Request HEAD /v1/acl/auth-method/, error: method HEAD not allowed from=127.0.0.1:52396
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.249616 [DEBUG] http: Request HEAD /v1/acl/auth-method/ (217.005µs) from=127.0.0.1:52396
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.251433 [DEBUG] http: Request OPTIONS /v1/acl/auth-method/ (15.667µs) from=127.0.0.1:52396
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.253461 [ERR] http: Request GET /v1/coordinate/node/, error: Permission denied from=127.0.0.1:52396
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.254059 [DEBUG] http: Request GET /v1/coordinate/node/ (1.13803ms) from=127.0.0.1:52396
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.259840 [ERR] http: Request PUT /v1/coordinate/node/, error: method PUT not allowed from=127.0.0.1:52398
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.260488 [DEBUG] http: Request PUT /v1/coordinate/node/ (645.684µs) from=127.0.0.1:52398
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.263337 [ERR] http: Request POST /v1/coordinate/node/, error: method POST not allowed from=127.0.0.1:52400
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.264713 [DEBUG] http: Request POST /v1/coordinate/node/ (1.38437ms) from=127.0.0.1:52400
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.282511 [ERR] http: Request DELETE /v1/coordinate/node/, error: method DELETE not allowed from=127.0.0.1:52402
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.287717 [DEBUG] http: Request DELETE /v1/coordinate/node/ (5.195805ms) from=127.0.0.1:52402
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.295038 [ERR] http: Request HEAD /v1/coordinate/node/, error: method HEAD not allowed from=127.0.0.1:52404
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.295188 [DEBUG] http: Request HEAD /v1/coordinate/node/ (170.338µs) from=127.0.0.1:52404
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.301394 [DEBUG] http: Request OPTIONS /v1/coordinate/node/ (19.667µs) from=127.0.0.1:52404
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.302914 [ERR] http: Request GET /v1/session/destroy/, error: method GET not allowed from=127.0.0.1:52404
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.303557 [DEBUG] http: Request GET /v1/session/destroy/ (621.017µs) from=127.0.0.1:52404
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.307831 [DEBUG] http: Request PUT /v1/session/destroy/ (511.014µs) from=127.0.0.1:52406
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.310864 [ERR] http: Request POST /v1/session/destroy/, error: method POST not allowed from=127.0.0.1:52408
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.311596 [DEBUG] http: Request POST /v1/session/destroy/ (725.019µs) from=127.0.0.1:52408
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.314701 [ERR] http: Request DELETE /v1/session/destroy/, error: method DELETE not allowed from=127.0.0.1:52410
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.315296 [DEBUG] http: Request DELETE /v1/session/destroy/ (593.682µs) from=127.0.0.1:52410
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.318524 [ERR] http: Request HEAD /v1/session/destroy/, error: method HEAD not allowed from=127.0.0.1:52412
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.318680 [DEBUG] http: Request HEAD /v1/session/destroy/ (178.338µs) from=127.0.0.1:52412
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.321831 [DEBUG] http: Request OPTIONS /v1/session/destroy/ (15.001µs) from=127.0.0.1:52412
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.323327 [ERR] http: Request GET /v1/acl/clone/, error: method GET not allowed from=127.0.0.1:52412
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.323877 [DEBUG] http: Request GET /v1/acl/clone/ (550.681µs) from=127.0.0.1:52412
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.327115 [DEBUG] http: Request PUT /v1/acl/clone/ (471.345µs) from=127.0.0.1:52414
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.330170 [ERR] http: Request POST /v1/acl/clone/, error: method POST not allowed from=127.0.0.1:52416
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.330808 [DEBUG] http: Request POST /v1/acl/clone/ (641.35µs) from=127.0.0.1:52416
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.333813 [ERR] http: Request DELETE /v1/acl/clone/, error: method DELETE not allowed from=127.0.0.1:52418
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.334608 [DEBUG] http: Request DELETE /v1/acl/clone/ (771.687µs) from=127.0.0.1:52418
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.337441 [ERR] http: Request HEAD /v1/acl/clone/, error: method HEAD not allowed from=127.0.0.1:52420
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.337580 [DEBUG] http: Request HEAD /v1/acl/clone/ (164.671µs) from=127.0.0.1:52420
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.339042 [DEBUG] http: Request OPTIONS /v1/acl/clone/ (18.334µs) from=127.0.0.1:52420
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.343187 [ERR] http: Request GET /v1/agent/health/service/id/, error: Bad request: Missing serviceID from=127.0.0.1:52420
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.343729 [DEBUG] http: Request GET /v1/agent/health/service/id/ (553.681µs) from=127.0.0.1:52420
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.346426 [ERR] http: Request PUT /v1/agent/health/service/id/, error: method PUT not allowed from=127.0.0.1:52422
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.347075 [DEBUG] http: Request PUT /v1/agent/health/service/id/ (660.017µs) from=127.0.0.1:52422
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.353126 [ERR] http: Request POST /v1/agent/health/service/id/, error: method POST not allowed from=127.0.0.1:52424
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.353660 [DEBUG] http: Request POST /v1/agent/health/service/id/ (550.681µs) from=127.0.0.1:52424
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.356796 [ERR] http: Request DELETE /v1/agent/health/service/id/, error: method DELETE not allowed from=127.0.0.1:52426
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.359544 [DEBUG] http: Request DELETE /v1/agent/health/service/id/ (2.74974ms) from=127.0.0.1:52426
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.373544 [ERR] http: Request HEAD /v1/agent/health/service/id/, error: method HEAD not allowed from=127.0.0.1:52428
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.373688 [DEBUG] http: Request HEAD /v1/agent/health/service/id/ (176.004µs) from=127.0.0.1:52428
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.388929 [DEBUG] http: Request OPTIONS /v1/agent/health/service/id/ (17.667µs) from=127.0.0.1:52428
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.392418 [ERR] http: Request GET /v1/agent/check/pass/, error: method GET not allowed from=127.0.0.1:52428
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.392963 [DEBUG] http: Request GET /v1/agent/check/pass/ (551.681µs) from=127.0.0.1:52428
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.396563 [ERR] http: Request PUT /v1/agent/check/pass/, error: Unknown check "" from=127.0.0.1:52430
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.397190 [DEBUG] http: Request PUT /v1/agent/check/pass/ (774.02µs) from=127.0.0.1:52430
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.400461 [ERR] http: Request POST /v1/agent/check/pass/, error: method POST not allowed from=127.0.0.1:52432
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.401049 [DEBUG] http: Request POST /v1/agent/check/pass/ (601.35µs) from=127.0.0.1:52432
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.405008 [ERR] http: Request DELETE /v1/agent/check/pass/, error: method DELETE not allowed from=127.0.0.1:52434
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.405590 [DEBUG] http: Request DELETE /v1/agent/check/pass/ (584.682µs) from=127.0.0.1:52434
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.408281 [ERR] http: Request HEAD /v1/agent/check/pass/, error: method HEAD not allowed from=127.0.0.1:52436
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.408433 [DEBUG] http: Request HEAD /v1/agent/check/pass/ (175.005µs) from=127.0.0.1:52436
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.409921 [DEBUG] http: Request OPTIONS /v1/agent/check/pass/ (18µs) from=127.0.0.1:52436
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.411270 [ERR] http: Request GET /v1/agent/connect/authorize, error: method GET not allowed from=127.0.0.1:52436
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.411890 [DEBUG] http: Request GET /v1/agent/connect/authorize (624.35µs) from=127.0.0.1:52436
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.414876 [ERR] http: Request PUT /v1/agent/connect/authorize, error: method PUT not allowed from=127.0.0.1:52438
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.415673 [DEBUG] http: Request PUT /v1/agent/connect/authorize (805.355µs) from=127.0.0.1:52438
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.418755 [ERR] http: Request POST /v1/agent/connect/authorize, error: Bad request: Request decode failed: EOF from=127.0.0.1:52440
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.419627 [DEBUG] http: Request POST /v1/agent/connect/authorize (938.025µs) from=127.0.0.1:52440
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.422304 [ERR] http: Request DELETE /v1/agent/connect/authorize, error: method DELETE not allowed from=127.0.0.1:52442
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.423079 [DEBUG] http: Request DELETE /v1/agent/connect/authorize (769.02µs) from=127.0.0.1:52442
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.426154 [ERR] http: Request HEAD /v1/agent/connect/authorize, error: method HEAD not allowed from=127.0.0.1:52444
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.426294 [DEBUG] http: Request HEAD /v1/agent/connect/authorize (166.338µs) from=127.0.0.1:52444
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.428308 [DEBUG] http: Request OPTIONS /v1/agent/connect/authorize (16.667µs) from=127.0.0.1:52444
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.429976 [ERR] http: Request GET /v1/agent/service/register, error: method GET not allowed from=127.0.0.1:52444
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.430543 [DEBUG] http: Request GET /v1/agent/service/register (569.015µs) from=127.0.0.1:52444
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.434132 [DEBUG] http: Request PUT /v1/agent/service/register (553.015µs) from=127.0.0.1:52446
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.437678 [ERR] http: Request POST /v1/agent/service/register, error: method POST not allowed from=127.0.0.1:52448
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.438359 [DEBUG] http: Request POST /v1/agent/service/register (665.351µs) from=127.0.0.1:52448
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.449955 [ERR] http: Request DELETE /v1/agent/service/register, error: method DELETE not allowed from=127.0.0.1:52450
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.450590 [DEBUG] http: Request DELETE /v1/agent/service/register (647.351µs) from=127.0.0.1:52450
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.453357 [ERR] http: Request HEAD /v1/agent/service/register, error: method HEAD not allowed from=127.0.0.1:52452
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.453491 [DEBUG] http: Request HEAD /v1/agent/service/register (154.004µs) from=127.0.0.1:52452
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.455088 [DEBUG] http: Request OPTIONS /v1/agent/service/register (15.667µs) from=127.0.0.1:52452
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.456474 [ERR] http: Request GET /v1/config, error: method GET not allowed from=127.0.0.1:52452
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.457068 [DEBUG] http: Request GET /v1/config (589.683µs) from=127.0.0.1:52452
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.460021 [ERR] http: Request PUT /v1/config, error: Bad request: Request decoding failed: EOF from=127.0.0.1:52454
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.460709 [DEBUG] http: Request PUT /v1/config (716.353µs) from=127.0.0.1:52454
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.470711 [ERR] http: Request POST /v1/config, error: method POST not allowed from=127.0.0.1:52456
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.471678 [DEBUG] http: Request POST /v1/config (945.025µs) from=127.0.0.1:52456
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.475009 [ERR] http: Request DELETE /v1/config, error: method DELETE not allowed from=127.0.0.1:52458
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.475844 [DEBUG] http: Request DELETE /v1/config (818.355µs) from=127.0.0.1:52458
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.479605 [ERR] http: Request HEAD /v1/config, error: method HEAD not allowed from=127.0.0.1:52460
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.479828 [DEBUG] http: Request HEAD /v1/config (285.008µs) from=127.0.0.1:52460
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.481432 [DEBUG] http: Request OPTIONS /v1/config (18.667µs) from=127.0.0.1:52460
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.483539 [DEBUG] http: Request GET /v1/health/service/ (642.684µs) from=127.0.0.1:52460
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.486441 [ERR] http: Request PUT /v1/health/service/, error: method PUT not allowed from=127.0.0.1:52462
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.487099 [DEBUG] http: Request PUT /v1/health/service/ (689.685µs) from=127.0.0.1:52462
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.496872 [ERR] http: Request POST /v1/health/service/, error: method POST not allowed from=127.0.0.1:52464
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.497492 [DEBUG] http: Request POST /v1/health/service/ (628.683µs) from=127.0.0.1:52464
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.500627 [ERR] http: Request DELETE /v1/health/service/, error: method DELETE not allowed from=127.0.0.1:52466
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.501655 [DEBUG] http: Request DELETE /v1/health/service/ (1.021694ms) from=127.0.0.1:52466
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.505478 [ERR] http: Request HEAD /v1/health/service/, error: method HEAD not allowed from=127.0.0.1:52468
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.505627 [DEBUG] http: Request HEAD /v1/health/service/ (184.672µs) from=127.0.0.1:52468
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.507119 [DEBUG] http: Request OPTIONS /v1/health/service/ (16.667µs) from=127.0.0.1:52468
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.509495 [DEBUG] http: Request GET /v1/agent/service/ (704.019µs) from=127.0.0.1:52468
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.514088 [ERR] http: Request PUT /v1/agent/service/, error: method PUT not allowed from=127.0.0.1:52470
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.514718 [DEBUG] http: Request PUT /v1/agent/service/ (609.683µs) from=127.0.0.1:52470
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.517948 [ERR] http: Request POST /v1/agent/service/, error: method POST not allowed from=127.0.0.1:52472
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.518530 [DEBUG] http: Request POST /v1/agent/service/ (581.682µs) from=127.0.0.1:52472
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.522184 [ERR] http: Request DELETE /v1/agent/service/, error: method DELETE not allowed from=127.0.0.1:52474
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.523113 [DEBUG] http: Request DELETE /v1/agent/service/ (905.357µs) from=127.0.0.1:52474
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.526336 [ERR] http: Request HEAD /v1/agent/service/, error: method HEAD not allowed from=127.0.0.1:52476
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.526607 [DEBUG] http: Request HEAD /v1/agent/service/ (286.674µs) from=127.0.0.1:52476
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.528170 [DEBUG] http: Request OPTIONS /v1/agent/service/ (16.667µs) from=127.0.0.1:52476
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.529944 [ERR] http: Request GET /v1/operator/raft/peer, error: method GET not allowed from=127.0.0.1:52476
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.530746 [DEBUG] http: Request GET /v1/operator/raft/peer (863.356µs) from=127.0.0.1:52476
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.536112 [ERR] http: Request PUT /v1/operator/raft/peer, error: method PUT not allowed from=127.0.0.1:52478
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.536855 [DEBUG] http: Request PUT /v1/operator/raft/peer (797.688µs) from=127.0.0.1:52478
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.542972 [ERR] http: Request POST /v1/operator/raft/peer, error: method POST not allowed from=127.0.0.1:52480
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.543596 [DEBUG] http: Request POST /v1/operator/raft/peer (624.683µs) from=127.0.0.1:52480
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.547473 [DEBUG] http: Request DELETE /v1/operator/raft/peer (620.017µs) from=127.0.0.1:52482
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.556707 [ERR] http: Request HEAD /v1/operator/raft/peer, error: method HEAD not allowed from=127.0.0.1:52484
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.556845 [DEBUG] http: Request HEAD /v1/operator/raft/peer (157.671µs) from=127.0.0.1:52484
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.558294 [DEBUG] http: Request OPTIONS /v1/operator/raft/peer (15.667µs) from=127.0.0.1:52484
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.561121 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.379037ms) from=127.0.0.1:52484
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.564860 [ERR] http: Request PUT /v1/coordinate/datacenters, error: method PUT not allowed from=127.0.0.1:52486
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.565527 [DEBUG] http: Request PUT /v1/coordinate/datacenters (672.018µs) from=127.0.0.1:52486
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.569089 [ERR] http: Request POST /v1/coordinate/datacenters, error: method POST not allowed from=127.0.0.1:52488
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.569985 [DEBUG] http: Request POST /v1/coordinate/datacenters (895.357µs) from=127.0.0.1:52488
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.573544 [ERR] http: Request DELETE /v1/coordinate/datacenters, error: method DELETE not allowed from=127.0.0.1:52490
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.574241 [DEBUG] http: Request DELETE /v1/coordinate/datacenters (691.352µs) from=127.0.0.1:52490
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.578480 [ERR] http: Request HEAD /v1/coordinate/datacenters, error: method HEAD not allowed from=127.0.0.1:52492
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.578617 [DEBUG] http: Request HEAD /v1/coordinate/datacenters (157.338µs) from=127.0.0.1:52492
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.580144 [DEBUG] http: Request OPTIONS /v1/coordinate/datacenters (17µs) from=127.0.0.1:52492
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.582290 [DEBUG] http: Request GET /v1/internal/ui/node/ (558.681µs) from=127.0.0.1:52492
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.586037 [ERR] http: Request PUT /v1/internal/ui/node/, error: method PUT not allowed from=127.0.0.1:52494
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.586689 [DEBUG] http: Request PUT /v1/internal/ui/node/ (658.685µs) from=127.0.0.1:52494
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.589719 [ERR] http: Request POST /v1/internal/ui/node/, error: method POST not allowed from=127.0.0.1:52496
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.590607 [DEBUG] http: Request POST /v1/internal/ui/node/ (873.356µs) from=127.0.0.1:52496
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.593960 [ERR] http: Request DELETE /v1/internal/ui/node/, error: method DELETE not allowed from=127.0.0.1:52498
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.594683 [DEBUG] http: Request DELETE /v1/internal/ui/node/ (726.686µs) from=127.0.0.1:52498
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.597834 [ERR] http: Request HEAD /v1/internal/ui/node/, error: method HEAD not allowed from=127.0.0.1:52500
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.597972 [DEBUG] http: Request HEAD /v1/internal/ui/node/ (156.671µs) from=127.0.0.1:52500
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.599457 [DEBUG] http: Request OPTIONS /v1/internal/ui/node/ (15.333µs) from=127.0.0.1:52500
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.601890 [DEBUG] http: Request GET /v1/session/node/ (618.683µs) from=127.0.0.1:52500
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.606358 [ERR] http: Request PUT /v1/session/node/, error: method PUT not allowed from=127.0.0.1:52502
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.607167 [DEBUG] http: Request PUT /v1/session/node/ (800.354µs) from=127.0.0.1:52502
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.610492 [ERR] http: Request POST /v1/session/node/, error: method POST not allowed from=127.0.0.1:52504
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.611310 [DEBUG] http: Request POST /v1/session/node/ (819.688µs) from=127.0.0.1:52504
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.615838 [ERR] http: Request DELETE /v1/session/node/, error: method DELETE not allowed from=127.0.0.1:52506
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.616558 [DEBUG] http: Request DELETE /v1/session/node/ (710.352µs) from=127.0.0.1:52506
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.619507 [ERR] http: Request HEAD /v1/session/node/, error: method HEAD not allowed from=127.0.0.1:52508
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.619650 [DEBUG] http: Request HEAD /v1/session/node/ (164.671µs) from=127.0.0.1:52508
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.621093 [DEBUG] http: Request OPTIONS /v1/session/node/ (17.001µs) from=127.0.0.1:52508
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.622548 [ERR] http: Request GET /v1/agent/join/, error: method GET not allowed from=127.0.0.1:52508
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.623158 [DEBUG] http: Request GET /v1/agent/join/ (610.349µs) from=127.0.0.1:52508
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.626334 [ERR] http: Request PUT /v1/agent/join/, error: Permission denied from=127.0.0.1:52510
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.626983 [DEBUG] http: Request PUT /v1/agent/join/ (794.354µs) from=127.0.0.1:52510
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.630254 [ERR] http: Request POST /v1/agent/join/, error: method POST not allowed from=127.0.0.1:52512
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.630927 [DEBUG] http: Request POST /v1/agent/join/ (675.351µs) from=127.0.0.1:52512
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.634167 [ERR] http: Request DELETE /v1/agent/join/, error: method DELETE not allowed from=127.0.0.1:52514
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.634972 [DEBUG] http: Request DELETE /v1/agent/join/ (813.021µs) from=127.0.0.1:52514
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.638101 [ERR] http: Request HEAD /v1/agent/join/, error: method HEAD not allowed from=127.0.0.1:52516
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.638247 [DEBUG] http: Request HEAD /v1/agent/join/ (171.004µs) from=127.0.0.1:52516
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.639966 [DEBUG] http: Request OPTIONS /v1/agent/join/ (18.333µs) from=127.0.0.1:52516
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.641723 [ERR] http: Request GET /v1/agent/check/register, error: method GET not allowed from=127.0.0.1:52516
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.642574 [DEBUG] http: Request GET /v1/agent/check/register (877.023µs) from=127.0.0.1:52516
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.647251 [DEBUG] http: Request PUT /v1/agent/check/register (678.018µs) from=127.0.0.1:52518
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.655678 [ERR] http: Request POST /v1/agent/check/register, error: method POST not allowed from=127.0.0.1:52520
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.656404 [DEBUG] http: Request POST /v1/agent/check/register (739.687µs) from=127.0.0.1:52520
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.659756 [ERR] http: Request DELETE /v1/agent/check/register, error: method DELETE not allowed from=127.0.0.1:52522
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.660321 [DEBUG] http: Request DELETE /v1/agent/check/register (576.682µs) from=127.0.0.1:52522
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.663213 [ERR] http: Request HEAD /v1/agent/check/register, error: method HEAD not allowed from=127.0.0.1:52524
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.663352 [DEBUG] http: Request HEAD /v1/agent/check/register (160.337µs) from=127.0.0.1:52524
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.664997 [DEBUG] http: Request OPTIONS /v1/agent/check/register (19.333µs) from=127.0.0.1:52524
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.668021 [DEBUG] http: Request GET /v1/acl/replication (1.261034ms) from=127.0.0.1:52524
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.671459 [ERR] http: Request PUT /v1/acl/replication, error: method PUT not allowed from=127.0.0.1:52526
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.672009 [DEBUG] http: Request PUT /v1/acl/replication (557.348µs) from=127.0.0.1:52526
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.674784 [ERR] http: Request POST /v1/acl/replication, error: method POST not allowed from=127.0.0.1:52528
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.675291 [DEBUG] http: Request POST /v1/acl/replication (515.681µs) from=127.0.0.1:52528
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.679007 [ERR] http: Request DELETE /v1/acl/replication, error: method DELETE not allowed from=127.0.0.1:52530
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.679615 [DEBUG] http: Request DELETE /v1/acl/replication (612.016µs) from=127.0.0.1:52530
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.682387 [ERR] http: Request HEAD /v1/acl/replication, error: method HEAD not allowed from=127.0.0.1:52532
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.682539 [DEBUG] http: Request HEAD /v1/acl/replication (175.005µs) from=127.0.0.1:52532
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.684099 [DEBUG] http: Request OPTIONS /v1/acl/replication (15.668µs) from=127.0.0.1:52532
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.685716 [ERR] http: Request GET /v1/connect/intentions/match, error: required query parameter 'by' not set from=127.0.0.1:52532
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.686362 [DEBUG] http: Request GET /v1/connect/intentions/match (656.684µs) from=127.0.0.1:52532
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.694932 [ERR] http: Request PUT /v1/connect/intentions/match, error: method PUT not allowed from=127.0.0.1:52534
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.695529 [DEBUG] http: Request PUT /v1/connect/intentions/match (608.35µs) from=127.0.0.1:52534
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.705560 [ERR] http: Request POST /v1/connect/intentions/match, error: method POST not allowed from=127.0.0.1:52536
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.706175 [DEBUG] http: Request POST /v1/connect/intentions/match (616.683µs) from=127.0.0.1:52536
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.709021 [ERR] http: Request DELETE /v1/connect/intentions/match, error: method DELETE not allowed from=127.0.0.1:52538
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.709768 [DEBUG] http: Request DELETE /v1/connect/intentions/match (732.019µs) from=127.0.0.1:52538
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.712693 [ERR] http: Request HEAD /v1/connect/intentions/match, error: method HEAD not allowed from=127.0.0.1:52540
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.712840 [DEBUG] http: Request HEAD /v1/connect/intentions/match (172.671µs) from=127.0.0.1:52540
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.714462 [DEBUG] http: Request OPTIONS /v1/connect/intentions/match (14.334µs) from=127.0.0.1:52540
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.716234 [ERR] http: Request GET /v1/session/create, error: method GET not allowed from=127.0.0.1:52540
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.717234 [DEBUG] http: Request GET /v1/session/create (995.693µs) from=127.0.0.1:52540
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.720784 [ERR] http: Request PUT /v1/session/create, error: Permission denied from=127.0.0.1:52542
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.721335 [DEBUG] http: Request PUT /v1/session/create (953.358µs) from=127.0.0.1:52542
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.724200 [ERR] http: Request POST /v1/session/create, error: method POST not allowed from=127.0.0.1:52544
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.724918 [DEBUG] http: Request POST /v1/session/create (746.353µs) from=127.0.0.1:52544
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.728076 [ERR] http: Request DELETE /v1/session/create, error: method DELETE not allowed from=127.0.0.1:52546
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.728799 [DEBUG] http: Request DELETE /v1/session/create (728.353µs) from=127.0.0.1:52546
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.731951 [ERR] http: Request HEAD /v1/session/create, error: method HEAD not allowed from=127.0.0.1:52548
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.732096 [DEBUG] http: Request HEAD /v1/session/create (165.004µs) from=127.0.0.1:52548
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.733569 [DEBUG] http: Request OPTIONS /v1/session/create (14.334µs) from=127.0.0.1:52548
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.735191 [ERR] http: Request GET /v1/coordinate/update, error: method GET not allowed from=127.0.0.1:52548
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.735721 [DEBUG] http: Request GET /v1/coordinate/update (533.348µs) from=127.0.0.1:52548
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.739459 [DEBUG] http: Request PUT /v1/coordinate/update (596.016µs) from=127.0.0.1:52550
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.742510 [ERR] http: Request POST /v1/coordinate/update, error: method POST not allowed from=127.0.0.1:52552
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.743047 [DEBUG] http: Request POST /v1/coordinate/update (540.348µs) from=127.0.0.1:52552
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.745988 [ERR] http: Request DELETE /v1/coordinate/update, error: method DELETE not allowed from=127.0.0.1:52554
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.746636 [DEBUG] http: Request DELETE /v1/coordinate/update (646.017µs) from=127.0.0.1:52554
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.749981 [ERR] http: Request HEAD /v1/coordinate/update, error: method HEAD not allowed from=127.0.0.1:52556
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.750132 [DEBUG] http: Request HEAD /v1/coordinate/update (172.671µs) from=127.0.0.1:52556
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.751637 [DEBUG] http: Request OPTIONS /v1/coordinate/update (13.667µs) from=127.0.0.1:52556
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.754259 [DEBUG] http: Request GET /v1/event/list (1.099696ms) from=127.0.0.1:52556
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.757283 [ERR] http: Request PUT /v1/event/list, error: method PUT not allowed from=127.0.0.1:52558
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.758097 [DEBUG] http: Request PUT /v1/event/list (797.688µs) from=127.0.0.1:52558
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.761356 [ERR] http: Request POST /v1/event/list, error: method POST not allowed from=127.0.0.1:52560
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.761942 [DEBUG] http: Request POST /v1/event/list (638.351µs) from=127.0.0.1:52560
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.764831 [ERR] http: Request DELETE /v1/event/list, error: method DELETE not allowed from=127.0.0.1:52562
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.765495 [DEBUG] http: Request DELETE /v1/event/list (665.018µs) from=127.0.0.1:52562
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.768888 [ERR] http: Request HEAD /v1/event/list, error: method HEAD not allowed from=127.0.0.1:52564
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.769034 [DEBUG] http: Request HEAD /v1/event/list (164.671µs) from=127.0.0.1:52564
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.770673 [DEBUG] http: Request OPTIONS /v1/event/list (16.667µs) from=127.0.0.1:52564
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.772265 [ERR] http: Request GET /v1/acl/update, error: method GET not allowed from=127.0.0.1:52564
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.772870 [DEBUG] http: Request GET /v1/acl/update (601.015µs) from=127.0.0.1:52564
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.781197 [DEBUG] http: Request PUT /v1/acl/update (409.678µs) from=127.0.0.1:52566
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.784139 [ERR] http: Request POST /v1/acl/update, error: method POST not allowed from=127.0.0.1:52568
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.784812 [DEBUG] http: Request POST /v1/acl/update (672.351µs) from=127.0.0.1:52568
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.787565 [ERR] http: Request DELETE /v1/acl/update, error: method DELETE not allowed from=127.0.0.1:52570
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.788197 [DEBUG] http: Request DELETE /v1/acl/update (635.35µs) from=127.0.0.1:52570
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.791126 [ERR] http: Request HEAD /v1/acl/update, error: method HEAD not allowed from=127.0.0.1:52572
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.791282 [DEBUG] http: Request HEAD /v1/acl/update (178.671µs) from=127.0.0.1:52572
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.792786 [DEBUG] http: Request OPTIONS /v1/acl/update (14.334µs) from=127.0.0.1:52572
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.794222 [ERR] http: Request GET /v1/agent/check/update/, error: method GET not allowed from=127.0.0.1:52572
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.794960 [DEBUG] http: Request GET /v1/agent/check/update/ (726.352µs) from=127.0.0.1:52572
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.799256 [DEBUG] http: Request PUT /v1/agent/check/update/ (496.347µs) from=127.0.0.1:52574
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.805495 [ERR] http: Request POST /v1/agent/check/update/, error: method POST not allowed from=127.0.0.1:52576
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.806103 [DEBUG] http: Request POST /v1/agent/check/update/ (601.683µs) from=127.0.0.1:52576
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.810312 [ERR] http: Request DELETE /v1/agent/check/update/, error: method DELETE not allowed from=127.0.0.1:52578
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.810940 [DEBUG] http: Request DELETE /v1/agent/check/update/ (626.684µs) from=127.0.0.1:52578
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.815719 [ERR] http: Request HEAD /v1/agent/check/update/, error: method HEAD not allowed from=127.0.0.1:52580
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.815869 [DEBUG] http: Request HEAD /v1/agent/check/update/ (182.338µs) from=127.0.0.1:52580
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.817181 [DEBUG] http: Request OPTIONS /v1/agent/check/update/ (14.333µs) from=127.0.0.1:52580
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.818598 [ERR] http: Request GET /v1/event/fire/, error: method GET not allowed from=127.0.0.1:52580
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.819218 [DEBUG] http: Request GET /v1/event/fire/ (615.683µs) from=127.0.0.1:52580
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.823630 [DEBUG] http: Request PUT /v1/event/fire/ (513.013µs) from=127.0.0.1:52582
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.831960 [ERR] http: Request POST /v1/event/fire/, error: method POST not allowed from=127.0.0.1:52584
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.832582 [DEBUG] http: Request POST /v1/event/fire/ (624.35µs) from=127.0.0.1:52584
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.838133 [ERR] http: Request DELETE /v1/event/fire/, error: method DELETE not allowed from=127.0.0.1:52586
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.838723 [DEBUG] http: Request DELETE /v1/event/fire/ (597.016µs) from=127.0.0.1:52586
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.841584 [ERR] http: Request HEAD /v1/event/fire/, error: method HEAD not allowed from=127.0.0.1:52588
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.841726 [DEBUG] http: Request HEAD /v1/event/fire/ (162.338µs) from=127.0.0.1:52588
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.843819 [DEBUG] http: Request OPTIONS /v1/event/fire/ (16µs) from=127.0.0.1:52588
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.846808 [DEBUG] consul: dropping node "Node a465168d-d9e1-e400-3047-69dddd922f5b" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.847698 [DEBUG] http: Request GET /v1/catalog/nodes (1.349702ms) from=127.0.0.1:52588
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.850985 [ERR] http: Request PUT /v1/catalog/nodes, error: method PUT not allowed from=127.0.0.1:52590
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.851623 [DEBUG] http: Request PUT /v1/catalog/nodes (637.35µs) from=127.0.0.1:52590
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.859930 [ERR] http: Request POST /v1/catalog/nodes, error: method POST not allowed from=127.0.0.1:52592
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.860626 [DEBUG] http: Request POST /v1/catalog/nodes (695.352µs) from=127.0.0.1:52592
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.865149 [ERR] http: Request DELETE /v1/catalog/nodes, error: method DELETE not allowed from=127.0.0.1:52594
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.865838 [DEBUG] http: Request DELETE /v1/catalog/nodes (690.018µs) from=127.0.0.1:52594
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.869469 [ERR] http: Request HEAD /v1/catalog/nodes, error: method HEAD not allowed from=127.0.0.1:52596
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.869702 [DEBUG] http: Request HEAD /v1/catalog/nodes (321.675µs) from=127.0.0.1:52596
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.871590 [DEBUG] http: Request OPTIONS /v1/catalog/nodes (17.667µs) from=127.0.0.1:52596
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.874125 [DEBUG] http: Request GET /v1/acl/info/ (622.683µs) from=127.0.0.1:52596
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.895158 [ERR] http: Request PUT /v1/acl/info/, error: method PUT not allowed from=127.0.0.1:52598
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.896147 [DEBUG] http: Request PUT /v1/acl/info/ (992.36µs) from=127.0.0.1:52598
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.900236 [ERR] http: Request POST /v1/acl/info/, error: method POST not allowed from=127.0.0.1:52600
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.900863 [DEBUG] http: Request POST /v1/acl/info/ (625.35µs) from=127.0.0.1:52600
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.903629 [ERR] http: Request DELETE /v1/acl/info/, error: method DELETE not allowed from=127.0.0.1:52602
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.904233 [DEBUG] http: Request DELETE /v1/acl/info/ (597.35µs) from=127.0.0.1:52602
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.907254 [ERR] http: Request HEAD /v1/acl/info/, error: method HEAD not allowed from=127.0.0.1:52604
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.907495 [DEBUG] http: Request HEAD /v1/acl/info/ (259.673µs) from=127.0.0.1:52604
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.909036 [DEBUG] http: Request OPTIONS /v1/acl/info/ (18.667µs) from=127.0.0.1:52604
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.911118 [ERR] http: Request GET /v1/acl/tokens, error: Permission denied from=127.0.0.1:52604
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.911777 [DEBUG] http: Request GET /v1/acl/tokens (1.110696ms) from=127.0.0.1:52604
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.914811 [ERR] http: Request PUT /v1/acl/tokens, error: method PUT not allowed from=127.0.0.1:52606
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.915442 [DEBUG] http: Request PUT /v1/acl/tokens (632.35µs) from=127.0.0.1:52606
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.919306 [ERR] http: Request POST /v1/acl/tokens, error: method POST not allowed from=127.0.0.1:52608
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.920015 [DEBUG] http: Request POST /v1/acl/tokens (695.019µs) from=127.0.0.1:52608
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.923162 [ERR] http: Request DELETE /v1/acl/tokens, error: method DELETE not allowed from=127.0.0.1:52610
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.923709 [DEBUG] http: Request DELETE /v1/acl/tokens (549.348µs) from=127.0.0.1:52610
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.926552 [ERR] http: Request HEAD /v1/acl/tokens, error: method HEAD not allowed from=127.0.0.1:52612
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.926712 [DEBUG] http: Request HEAD /v1/acl/tokens (180.005µs) from=127.0.0.1:52612
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.928311 [DEBUG] http: Request OPTIONS /v1/acl/tokens (18.001µs) from=127.0.0.1:52612
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.933169 [DEBUG] http: Request GET /v1/agent/services (3.208418ms) from=127.0.0.1:52612
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.936768 [ERR] http: Request PUT /v1/agent/services, error: method PUT not allowed from=127.0.0.1:52614
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.937470 [DEBUG] http: Request PUT /v1/agent/services (698.352µs) from=127.0.0.1:52614
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.940544 [ERR] http: Request POST /v1/agent/services, error: method POST not allowed from=127.0.0.1:52616
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.941175 [DEBUG] http: Request POST /v1/agent/services (632.35µs) from=127.0.0.1:52616
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.944167 [ERR] http: Request DELETE /v1/agent/services, error: method DELETE not allowed from=127.0.0.1:52618
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.944807 [DEBUG] http: Request DELETE /v1/agent/services (637.017µs) from=127.0.0.1:52618
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.947762 [ERR] http: Request HEAD /v1/agent/services, error: method HEAD not allowed from=127.0.0.1:52620
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.947912 [DEBUG] http: Request HEAD /v1/agent/services (163.671µs) from=127.0.0.1:52620
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.950037 [DEBUG] http: Request OPTIONS /v1/agent/services (95.336µs) from=127.0.0.1:52620
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.952214 [DEBUG] http: Request GET /v1/catalog/service/ (444.678µs) from=127.0.0.1:52620
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.955651 [ERR] http: Request PUT /v1/catalog/service/, error: method PUT not allowed from=127.0.0.1:52622
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.956262 [DEBUG] http: Request PUT /v1/catalog/service/ (611.349µs) from=127.0.0.1:52622
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.959219 [ERR] http: Request POST /v1/catalog/service/, error: method POST not allowed from=127.0.0.1:52624
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.959942 [DEBUG] http: Request POST /v1/catalog/service/ (719.686µs) from=127.0.0.1:52624
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.967021 [ERR] http: Request DELETE /v1/catalog/service/, error: method DELETE not allowed from=127.0.0.1:52626
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.967677 [DEBUG] http: Request DELETE /v1/catalog/service/ (650.351µs) from=127.0.0.1:52626
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.970916 [ERR] http: Request HEAD /v1/catalog/service/, error: method HEAD not allowed from=127.0.0.1:52628
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.971066 [DEBUG] http: Request HEAD /v1/catalog/service/ (163.004µs) from=127.0.0.1:52628
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.972840 [DEBUG] http: Request OPTIONS /v1/catalog/service/ (18.667µs) from=127.0.0.1:52628
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.975056 [DEBUG] http: Request GET /v1/kv/ (394.01µs) from=127.0.0.1:52628
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.978455 [DEBUG] http: Request PUT /v1/kv/ (482.346µs) from=127.0.0.1:52630
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.981398 [ERR] http: Request POST /v1/kv/, error: method POST not allowed from=127.0.0.1:52632
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.981957 [DEBUG] http: Request POST /v1/kv/ (561.349µs) from=127.0.0.1:52632
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.985324 [DEBUG] http: Request DELETE /v1/kv/ (437.679µs) from=127.0.0.1:52634
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.988076 [ERR] http: Request HEAD /v1/kv/, error: method HEAD not allowed from=127.0.0.1:52636
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.988221 [DEBUG] http: Request HEAD /v1/kv/ (162.337µs) from=127.0.0.1:52636
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.989851 [DEBUG] http: Request OPTIONS /v1/kv/ (15µs) from=127.0.0.1:52636
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.992006 [ERR] http: Request GET /v1/acl/token/self, error: ACL not found from=127.0.0.1:52636
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.992643 [DEBUG] http: Request GET /v1/acl/token/self (1.044695ms) from=127.0.0.1:52636
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.995928 [ERR] http: Request PUT /v1/acl/token/self, error: method PUT not allowed from=127.0.0.1:52638
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.996565 [DEBUG] http: Request PUT /v1/acl/token/self (631.017µs) from=127.0.0.1:52638
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:02.999784 [ERR] http: Request POST /v1/acl/token/self, error: method POST not allowed from=127.0.0.1:52640
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.000402 [DEBUG] http: Request POST /v1/acl/token/self (628.683µs) from=127.0.0.1:52640
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.003530 [ERR] http: Request DELETE /v1/acl/token/self, error: method DELETE not allowed from=127.0.0.1:52642
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.004162 [DEBUG] http: Request DELETE /v1/acl/token/self (637.683µs) from=127.0.0.1:52642
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.007204 [ERR] http: Request HEAD /v1/acl/token/self, error: method HEAD not allowed from=127.0.0.1:52644
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.007359 [DEBUG] http: Request HEAD /v1/acl/token/self (181.672µs) from=127.0.0.1:52644
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.008741 [DEBUG] http: Request OPTIONS /v1/acl/token/self (14.667µs) from=127.0.0.1:52644
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.010762 [DEBUG] http: Request GET /v1/catalog/connect/ (546.015µs) from=127.0.0.1:52644
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.013840 [ERR] http: Request PUT /v1/catalog/connect/, error: method PUT not allowed from=127.0.0.1:52646
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.014555 [DEBUG] http: Request PUT /v1/catalog/connect/ (712.352µs) from=127.0.0.1:52646
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.017722 [ERR] http: Request POST /v1/catalog/connect/, error: method POST not allowed from=127.0.0.1:52648
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.018443 [DEBUG] http: Request POST /v1/catalog/connect/ (717.019µs) from=127.0.0.1:52648
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.021453 [ERR] http: Request DELETE /v1/catalog/connect/, error: method DELETE not allowed from=127.0.0.1:52650
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.022078 [DEBUG] http: Request DELETE /v1/catalog/connect/ (623.017µs) from=127.0.0.1:52650
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.024907 [ERR] http: Request HEAD /v1/catalog/connect/, error: method HEAD not allowed from=127.0.0.1:52652
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.025046 [DEBUG] http: Request HEAD /v1/catalog/connect/ (157.338µs) from=127.0.0.1:52652
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.026509 [DEBUG] http: Request OPTIONS /v1/catalog/connect/ (16µs) from=127.0.0.1:52652
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.028006 [ERR] http: Request GET /v1/acl/destroy/, error: method GET not allowed from=127.0.0.1:52652
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.028595 [DEBUG] http: Request GET /v1/acl/destroy/ (587.683µs) from=127.0.0.1:52652
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.032300 [DEBUG] http: Request PUT /v1/acl/destroy/ (546.014µs) from=127.0.0.1:52654
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.041177 [ERR] http: Request POST /v1/acl/destroy/, error: method POST not allowed from=127.0.0.1:52656
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.041677 [DEBUG] http: Request POST /v1/acl/destroy/ (509.013µs) from=127.0.0.1:52656
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.045003 [ERR] http: Request DELETE /v1/acl/destroy/, error: method DELETE not allowed from=127.0.0.1:52658
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.045632 [DEBUG] http: Request DELETE /v1/acl/destroy/ (632.684µs) from=127.0.0.1:52658
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.048614 [ERR] http: Request HEAD /v1/acl/destroy/, error: method HEAD not allowed from=127.0.0.1:52660
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.048939 [DEBUG] http: Request HEAD /v1/acl/destroy/ (341.342µs) from=127.0.0.1:52660
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.052788 [DEBUG] http: Request OPTIONS /v1/acl/destroy/ (18.667µs) from=127.0.0.1:52660
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.059239 [ERR] http: Request GET /v1/acl/auth-methods, error: Permission denied from=127.0.0.1:52660
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.060090 [DEBUG] http: Request GET /v1/acl/auth-methods (1.351702ms) from=127.0.0.1:52660
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.063516 [ERR] http: Request PUT /v1/acl/auth-methods, error: method PUT not allowed from=127.0.0.1:52662
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.064228 [DEBUG] http: Request PUT /v1/acl/auth-methods (707.685µs) from=127.0.0.1:52662
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.067472 [ERR] http: Request POST /v1/acl/auth-methods, error: method POST not allowed from=127.0.0.1:52664
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.068173 [DEBUG] http: Request POST /v1/acl/auth-methods (696.018µs) from=127.0.0.1:52664
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.071566 [ERR] http: Request DELETE /v1/acl/auth-methods, error: method DELETE not allowed from=127.0.0.1:52666
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.072274 [DEBUG] http: Request DELETE /v1/acl/auth-methods (695.685µs) from=127.0.0.1:52666
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.077647 [ERR] http: Request HEAD /v1/acl/auth-methods, error: method HEAD not allowed from=127.0.0.1:52668
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.077810 [DEBUG] http: Request HEAD /v1/acl/auth-methods (181.338µs) from=127.0.0.1:52668
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.079617 [DEBUG] http: Request OPTIONS /v1/acl/auth-methods (17.334µs) from=127.0.0.1:52668
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.081666 [DEBUG] consul: dropping service "consul" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.082399 [DEBUG] http: Request GET /v1/catalog/services (1.222032ms) from=127.0.0.1:52668
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.085308 [ERR] http: Request PUT /v1/catalog/services, error: method PUT not allowed from=127.0.0.1:52670
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.086000 [DEBUG] http: Request PUT /v1/catalog/services (714.685µs) from=127.0.0.1:52670
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.089291 [ERR] http: Request POST /v1/catalog/services, error: method POST not allowed from=127.0.0.1:52672
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.089957 [DEBUG] http: Request POST /v1/catalog/services (674.351µs) from=127.0.0.1:52672
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.093249 [ERR] http: Request DELETE /v1/catalog/services, error: method DELETE not allowed from=127.0.0.1:52674
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.094148 [DEBUG] http: Request DELETE /v1/catalog/services (891.357µs) from=127.0.0.1:52674
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.101502 [ERR] http: Request HEAD /v1/catalog/services, error: method HEAD not allowed from=127.0.0.1:52676
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.101655 [DEBUG] http: Request HEAD /v1/catalog/services (174.672µs) from=127.0.0.1:52676
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.103488 [DEBUG] http: Request OPTIONS /v1/catalog/services (16.667µs) from=127.0.0.1:52676
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.105556 [ERR] http: Request GET /v1/operator/autopilot/health, error: Permission denied from=127.0.0.1:52676
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.106166 [DEBUG] http: Request GET /v1/operator/autopilot/health (1.138697ms) from=127.0.0.1:52676
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.110090 [ERR] http: Request PUT /v1/operator/autopilot/health, error: method PUT not allowed from=127.0.0.1:52678
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.110748 [DEBUG] http: Request PUT /v1/operator/autopilot/health (656.684µs) from=127.0.0.1:52678
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.113700 [ERR] http: Request POST /v1/operator/autopilot/health, error: method POST not allowed from=127.0.0.1:52680
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.114350 [DEBUG] http: Request POST /v1/operator/autopilot/health (662.018µs) from=127.0.0.1:52680
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.117332 [ERR] http: Request DELETE /v1/operator/autopilot/health, error: method DELETE not allowed from=127.0.0.1:52682
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.118161 [DEBUG] http: Request DELETE /v1/operator/autopilot/health (816.022µs) from=127.0.0.1:52682
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.124562 [ERR] http: Request HEAD /v1/operator/autopilot/health, error: method HEAD not allowed from=127.0.0.1:52684
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.124840 [DEBUG] http: Request HEAD /v1/operator/autopilot/health (304.341µs) from=127.0.0.1:52684
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.126557 [DEBUG] http: Request OPTIONS /v1/operator/autopilot/health (18µs) from=127.0.0.1:52684
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.128231 [ERR] http: Request GET /v1/acl/policy, error: method GET not allowed from=127.0.0.1:52684
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.128896 [DEBUG] http: Request GET /v1/acl/policy (668.351µs) from=127.0.0.1:52684
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.131948 [ERR] http: Request PUT /v1/acl/policy, error: Bad request: Policy decoding failed: EOF from=127.0.0.1:52686
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.132635 [DEBUG] http: Request PUT /v1/acl/policy (725.685µs) from=127.0.0.1:52686
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.136120 [ERR] http: Request POST /v1/acl/policy, error: method POST not allowed from=127.0.0.1:52688
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.136759 [DEBUG] http: Request POST /v1/acl/policy (639.683µs) from=127.0.0.1:52688
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.140032 [ERR] http: Request DELETE /v1/acl/policy, error: method DELETE not allowed from=127.0.0.1:52690
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.140793 [DEBUG] http: Request DELETE /v1/acl/policy (762.353µs) from=127.0.0.1:52690
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.143743 [ERR] http: Request HEAD /v1/acl/policy, error: method HEAD not allowed from=127.0.0.1:52692
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.143883 [DEBUG] http: Request HEAD /v1/acl/policy (159.337µs) from=127.0.0.1:52692
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.145483 [DEBUG] http: Request OPTIONS /v1/acl/policy (15.001µs) from=127.0.0.1:52692
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.147578 [ERR] http: Request GET /v1/operator/keyring, error: Reading keyring denied by ACLs from=127.0.0.1:52692
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.148192 [DEBUG] http: Request GET /v1/operator/keyring (1.187031ms) from=127.0.0.1:52692
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.152215 [DEBUG] http: Request PUT /v1/operator/keyring (571.348µs) from=127.0.0.1:52694
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.155888 [DEBUG] http: Request POST /v1/operator/keyring (505.346µs) from=127.0.0.1:52696
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.159603 [DEBUG] http: Request DELETE /v1/operator/keyring (510.347µs) from=127.0.0.1:52698
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.162815 [ERR] http: Request HEAD /v1/operator/keyring, error: method HEAD not allowed from=127.0.0.1:52700
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.162955 [DEBUG] http: Request HEAD /v1/operator/keyring (160.671µs) from=127.0.0.1:52700
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.164504 [DEBUG] http: Request OPTIONS /v1/operator/keyring (20.001µs) from=127.0.0.1:52700
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.166063 [ERR] http: Request GET /v1/agent/host, error: Permission denied from=127.0.0.1:52700
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.166689 [DEBUG] http: Request GET /v1/agent/host (739.353µs) from=127.0.0.1:52700
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.169859 [ERR] http: Request PUT /v1/agent/host, error: method PUT not allowed from=127.0.0.1:52702
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.170694 [DEBUG] http: Request PUT /v1/agent/host (835.689µs) from=127.0.0.1:52702
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.173993 [ERR] http: Request POST /v1/agent/host, error: method POST not allowed from=127.0.0.1:52704
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.174911 [DEBUG] http: Request POST /v1/agent/host (908.024µs) from=127.0.0.1:52704
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.177861 [ERR] http: Request DELETE /v1/agent/host, error: method DELETE not allowed from=127.0.0.1:52706
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.178440 [DEBUG] http: Request DELETE /v1/agent/host (578.016µs) from=127.0.0.1:52706
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.181279 [ERR] http: Request HEAD /v1/agent/host, error: method HEAD not allowed from=127.0.0.1:52708
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.181429 [DEBUG] http: Request HEAD /v1/agent/host (168.338µs) from=127.0.0.1:52708
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.183163 [DEBUG] http: Request OPTIONS /v1/agent/host (14.667µs) from=127.0.0.1:52708
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.184609 [ERR] http: Request GET /v1/agent/service/deregister/, error: method GET not allowed from=127.0.0.1:52708
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.185204 [DEBUG] http: Request GET /v1/agent/service/deregister/ (606.016µs) from=127.0.0.1:52708
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.188293 [ERR] http: Request PUT /v1/agent/service/deregister/, error: Unknown service "" from=127.0.0.1:52710
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.188882 [DEBUG] http: Request PUT /v1/agent/service/deregister/ (726.353µs) from=127.0.0.1:52710
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.191788 [ERR] http: Request POST /v1/agent/service/deregister/, error: method POST not allowed from=127.0.0.1:52712
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.192479 [DEBUG] http: Request POST /v1/agent/service/deregister/ (684.685µs) from=127.0.0.1:52712
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.195715 [ERR] http: Request DELETE /v1/agent/service/deregister/, error: method DELETE not allowed from=127.0.0.1:52714
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.196403 [DEBUG] http: Request DELETE /v1/agent/service/deregister/ (675.351µs) from=127.0.0.1:52714
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.199247 [ERR] http: Request HEAD /v1/agent/service/deregister/, error: method HEAD not allowed from=127.0.0.1:52716
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.199535 [DEBUG] http: Request HEAD /v1/agent/service/deregister/ (314.008µs) from=127.0.0.1:52716
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.201203 [DEBUG] http: Request OPTIONS /v1/agent/service/deregister/ (17.334µs) from=127.0.0.1:52716
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.202620 [ERR] http: Request GET /v1/catalog/deregister, error: method GET not allowed from=127.0.0.1:52716
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.203553 [DEBUG] http: Request GET /v1/catalog/deregister (936.358µs) from=127.0.0.1:52716
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.207267 [DEBUG] http: Request PUT /v1/catalog/deregister (535.681µs) from=127.0.0.1:52718
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.210580 [ERR] http: Request POST /v1/catalog/deregister, error: method POST not allowed from=127.0.0.1:52720
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.211247 [DEBUG] http: Request POST /v1/catalog/deregister (666.351µs) from=127.0.0.1:52720
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.214249 [ERR] http: Request DELETE /v1/catalog/deregister, error: method DELETE not allowed from=127.0.0.1:52722
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.215159 [DEBUG] http: Request DELETE /v1/catalog/deregister (910.691µs) from=127.0.0.1:52722
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.219100 [ERR] http: Request HEAD /v1/catalog/deregister, error: method HEAD not allowed from=127.0.0.1:52724
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.219260 [DEBUG] http: Request HEAD /v1/catalog/deregister (182.338µs) from=127.0.0.1:52724
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.221573 [DEBUG] http: Request OPTIONS /v1/catalog/deregister (15.667µs) from=127.0.0.1:52724
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.223392 [ERR] http: Request GET /v1/acl/rules/translate/, error: Bad request: Missing token ID from=127.0.0.1:52724
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.223965 [DEBUG] http: Request GET /v1/acl/rules/translate/ (576.682µs) from=127.0.0.1:52724
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.227167 [ERR] http: Request PUT /v1/acl/rules/translate/, error: method PUT not allowed from=127.0.0.1:52726
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.227698 [DEBUG] http: Request PUT /v1/acl/rules/translate/ (554.348µs) from=127.0.0.1:52726
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.240761 [ERR] http: Request POST /v1/acl/rules/translate/, error: method POST not allowed from=127.0.0.1:52728
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.241446 [DEBUG] http: Request POST /v1/acl/rules/translate/ (680.018µs) from=127.0.0.1:52728
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.247446 [ERR] http: Request DELETE /v1/acl/rules/translate/, error: method DELETE not allowed from=127.0.0.1:52730
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.247995 [DEBUG] http: Request DELETE /v1/acl/rules/translate/ (555.015µs) from=127.0.0.1:52730
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.251196 [ERR] http: Request HEAD /v1/acl/rules/translate/, error: method HEAD not allowed from=127.0.0.1:52732
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.251366 [DEBUG] http: Request HEAD /v1/acl/rules/translate/ (187.005µs) from=127.0.0.1:52732
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.253041 [DEBUG] http: Request OPTIONS /v1/acl/rules/translate/ (15.667µs) from=127.0.0.1:52732
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.255094 [ERR] http: Request GET /v1/connect/ca/configuration, error: Permission denied from=127.0.0.1:52732
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.255587 [DEBUG] http: Request GET /v1/connect/ca/configuration (935.692µs) from=127.0.0.1:52732
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.260443 [DEBUG] http: Request PUT /v1/connect/ca/configuration (556.349µs) from=127.0.0.1:52734
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.264723 [ERR] http: Request POST /v1/connect/ca/configuration, error: method POST not allowed from=127.0.0.1:52736
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.265253 [DEBUG] http: Request POST /v1/connect/ca/configuration (543.015µs) from=127.0.0.1:52736
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.268144 [ERR] http: Request DELETE /v1/connect/ca/configuration, error: method DELETE not allowed from=127.0.0.1:52738
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.269013 [DEBUG] http: Request DELETE /v1/connect/ca/configuration (860.689µs) from=127.0.0.1:52738
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.272238 [ERR] http: Request HEAD /v1/connect/ca/configuration, error: method HEAD not allowed from=127.0.0.1:52740
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.272402 [DEBUG] http: Request HEAD /v1/connect/ca/configuration (185.338µs) from=127.0.0.1:52740
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.274063 [DEBUG] http: Request OPTIONS /v1/connect/ca/configuration (16.334µs) from=127.0.0.1:52740
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.276778 [DEBUG] http: Request GET /v1/coordinate/nodes (1.105696ms) from=127.0.0.1:52740
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.280025 [ERR] http: Request PUT /v1/coordinate/nodes, error: method PUT not allowed from=127.0.0.1:52742
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.280660 [DEBUG] http: Request PUT /v1/coordinate/nodes (633.684µs) from=127.0.0.1:52742
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.285169 [ERR] http: Request POST /v1/coordinate/nodes, error: method POST not allowed from=127.0.0.1:52744
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.285818 [DEBUG] http: Request POST /v1/coordinate/nodes (655.684µs) from=127.0.0.1:52744
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.290463 [ERR] http: Request DELETE /v1/coordinate/nodes, error: method DELETE not allowed from=127.0.0.1:52746
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.291204 [DEBUG] http: Request DELETE /v1/coordinate/nodes (738.019µs) from=127.0.0.1:52746
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.296287 [ERR] http: Request HEAD /v1/coordinate/nodes, error: method HEAD not allowed from=127.0.0.1:52748
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.296498 [DEBUG] http: Request HEAD /v1/coordinate/nodes (234.007µs) from=127.0.0.1:52748
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.298931 [DEBUG] http: Request OPTIONS /v1/coordinate/nodes (19.334µs) from=127.0.0.1:52748
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.302015 [DEBUG] consul: dropping node "Node a465168d-d9e1-e400-3047-69dddd922f5b" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.303011 [DEBUG] http: Request GET /v1/internal/ui/services (1.673045ms) from=127.0.0.1:52748
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.307224 [ERR] http: Request PUT /v1/internal/ui/services, error: method PUT not allowed from=127.0.0.1:52750
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.308000 [DEBUG] http: Request PUT /v1/internal/ui/services (768.687µs) from=127.0.0.1:52750
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.311162 [ERR] http: Request POST /v1/internal/ui/services, error: method POST not allowed from=127.0.0.1:52752
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.311864 [DEBUG] http: Request POST /v1/internal/ui/services (700.352µs) from=127.0.0.1:52752
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.330495 [ERR] http: Request DELETE /v1/internal/ui/services, error: method DELETE not allowed from=127.0.0.1:52754
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.331139 [DEBUG] http: Request DELETE /v1/internal/ui/services (637.35µs) from=127.0.0.1:52754
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.334334 [ERR] http: Request HEAD /v1/internal/ui/services, error: method HEAD not allowed from=127.0.0.1:52756
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.334554 [DEBUG] http: Request HEAD /v1/internal/ui/services (239.673µs) from=127.0.0.1:52756
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.336019 [DEBUG] http: Request OPTIONS /v1/internal/ui/services (17.667µs) from=127.0.0.1:52756
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.338106 [DEBUG] http: Request GET /v1/health/state/ (537.347µs) from=127.0.0.1:52756
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.345397 [ERR] http: Request PUT /v1/health/state/, error: method PUT not allowed from=127.0.0.1:52758
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.346533 [DEBUG] http: Request PUT /v1/health/state/ (1.194698ms) from=127.0.0.1:52758
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.355552 [ERR] http: Request POST /v1/health/state/, error: method POST not allowed from=127.0.0.1:52760
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.356243 [DEBUG] http: Request POST /v1/health/state/ (684.018µs) from=127.0.0.1:52760
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.359556 [ERR] http: Request DELETE /v1/health/state/, error: method DELETE not allowed from=127.0.0.1:52762
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.360127 [DEBUG] http: Request DELETE /v1/health/state/ (577.682µs) from=127.0.0.1:52762
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.363033 [ERR] http: Request HEAD /v1/health/state/, error: method HEAD not allowed from=127.0.0.1:52764
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.363175 [DEBUG] http: Request HEAD /v1/health/state/ (167.671µs) from=127.0.0.1:52764
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.364899 [DEBUG] http: Request OPTIONS /v1/health/state/ (14.667µs) from=127.0.0.1:52764
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.366997 [DEBUG] http: Request GET /v1/health/connect/ (553.681µs) from=127.0.0.1:52764
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.370120 [ERR] http: Request PUT /v1/health/connect/, error: method PUT not allowed from=127.0.0.1:52766
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.370687 [DEBUG] http: Request PUT /v1/health/connect/ (573.682µs) from=127.0.0.1:52766
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.373881 [ERR] http: Request POST /v1/health/connect/, error: method POST not allowed from=127.0.0.1:52768
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.374526 [DEBUG] http: Request POST /v1/health/connect/ (645.017µs) from=127.0.0.1:52768
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.377884 [ERR] http: Request DELETE /v1/health/connect/, error: method DELETE not allowed from=127.0.0.1:52770
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.378627 [DEBUG] http: Request DELETE /v1/health/connect/ (760.687µs) from=127.0.0.1:52770
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.381924 [ERR] http: Request HEAD /v1/health/connect/, error: method HEAD not allowed from=127.0.0.1:52772
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.382066 [DEBUG] http: Request HEAD /v1/health/connect/ (161.337µs) from=127.0.0.1:52772
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.383738 [DEBUG] http: Request OPTIONS /v1/health/connect/ (19µs) from=127.0.0.1:52772
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.385874 [DEBUG] http: Request GET /v1/agent/connect/proxy/ (530.681µs) from=127.0.0.1:52772
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.389057 [ERR] http: Request PUT /v1/agent/connect/proxy/, error: method PUT not allowed from=127.0.0.1:52774
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.389680 [DEBUG] http: Request PUT /v1/agent/connect/proxy/ (635.684µs) from=127.0.0.1:52774
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.392608 [ERR] http: Request POST /v1/agent/connect/proxy/, error: method POST not allowed from=127.0.0.1:52776
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.393160 [DEBUG] http: Request POST /v1/agent/connect/proxy/ (554.348µs) from=127.0.0.1:52776
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.396232 [ERR] http: Request DELETE /v1/agent/connect/proxy/, error: method DELETE not allowed from=127.0.0.1:52778
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.396754 [DEBUG] http: Request DELETE /v1/agent/connect/proxy/ (536.347µs) from=127.0.0.1:52778
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.399971 [ERR] http: Request HEAD /v1/agent/connect/proxy/, error: method HEAD not allowed from=127.0.0.1:52780
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.400121 [DEBUG] http: Request HEAD /v1/agent/connect/proxy/ (168.338µs) from=127.0.0.1:52780
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.401669 [DEBUG] http: Request OPTIONS /v1/agent/connect/proxy/ (14.667µs) from=127.0.0.1:52780
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.403142 [ERR] http: Request GET /v1/acl/policy/, error: Bad request: Missing policy ID from=127.0.0.1:52780
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.403648 [DEBUG] http: Request GET /v1/acl/policy/ (519.014µs) from=127.0.0.1:52780
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.407180 [ERR] http: Request PUT /v1/acl/policy/, error: Bad request: Policy decoding failed: EOF from=127.0.0.1:52782
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.407761 [DEBUG] http: Request PUT /v1/acl/policy/ (695.685µs) from=127.0.0.1:52782
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.416588 [ERR] http: Request POST /v1/acl/policy/, error: method POST not allowed from=127.0.0.1:52784
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.417292 [DEBUG] http: Request POST /v1/acl/policy/ (707.685µs) from=127.0.0.1:52784
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.423073 [ERR] http: Request DELETE /v1/acl/policy/, error: Bad request: Missing policy ID from=127.0.0.1:52786
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.423607 [DEBUG] http: Request DELETE /v1/acl/policy/ (541.014µs) from=127.0.0.1:52786
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.427198 [ERR] http: Request HEAD /v1/acl/policy/, error: method HEAD not allowed from=127.0.0.1:52788
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.427337 [DEBUG] http: Request HEAD /v1/acl/policy/ (166.338µs) from=127.0.0.1:52788
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.428786 [DEBUG] http: Request OPTIONS /v1/acl/policy/ (16.001µs) from=127.0.0.1:52788
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.430569 [ERR] http: Request GET /v1/acl/token/, error: Bad request: Missing token ID from=127.0.0.1:52788
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.431057 [DEBUG] http: Request GET /v1/acl/token/ (495.68µs) from=127.0.0.1:52788
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.434110 [ERR] http: Request PUT /v1/acl/token/, error: Bad request: Token decoding failed: EOF from=127.0.0.1:52790
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.435040 [DEBUG] http: Request PUT /v1/acl/token/ (965.692µs) from=127.0.0.1:52790
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.440747 [ERR] http: Request POST /v1/acl/token/, error: method POST not allowed from=127.0.0.1:52792
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.441294 [DEBUG] http: Request POST /v1/acl/token/ (558.348µs) from=127.0.0.1:52792
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.444609 [ERR] http: Request DELETE /v1/acl/token/, error: Bad request: Missing token ID from=127.0.0.1:52794
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.445459 [DEBUG] http: Request DELETE /v1/acl/token/ (877.023µs) from=127.0.0.1:52794
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.451063 [ERR] http: Request HEAD /v1/acl/token/, error: method HEAD not allowed from=127.0.0.1:52796
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.451354 [DEBUG] http: Request HEAD /v1/acl/token/ (307.675µs) from=127.0.0.1:52796
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.453729 [DEBUG] http: Request OPTIONS /v1/acl/token/ (21µs) from=127.0.0.1:52796
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.456542 [ERR] http: Request GET /v1/snapshot, error: Permission denied from=127.0.0.1:52796
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.457171 [DEBUG] http: Request GET /v1/snapshot (930.024µs) from=127.0.0.1:52796
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.460691 [ERR] http: Request PUT /v1/snapshot, error: Permission denied from=127.0.0.1:52798
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.461316 [DEBUG] http: Request PUT /v1/snapshot (950.691µs) from=127.0.0.1:52798
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.464474 [ERR] http: Request POST /v1/snapshot, error: method POST not allowed from=127.0.0.1:52800
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.465098 [DEBUG] http: Request POST /v1/snapshot (685.019µs) from=127.0.0.1:52800
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.468797 [ERR] http: Request DELETE /v1/snapshot, error: method DELETE not allowed from=127.0.0.1:52802
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.474956 [DEBUG] http: Request DELETE /v1/snapshot (6.149163ms) from=127.0.0.1:52802
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.478538 [ERR] http: Request HEAD /v1/snapshot, error: method HEAD not allowed from=127.0.0.1:52804
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.478938 [DEBUG] http: Request HEAD /v1/snapshot (428.678µs) from=127.0.0.1:52804
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.480740 [DEBUG] http: Request OPTIONS /v1/snapshot (16µs) from=127.0.0.1:52804
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.482702 [ERR] http: Request GET /v1/agent/connect/ca/leaf/, error: Permission denied from=127.0.0.1:52804
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.483529 [DEBUG] http: Request GET /v1/agent/connect/ca/leaf/ (985.026µs) from=127.0.0.1:52804
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.487127 [ERR] http: Request PUT /v1/agent/connect/ca/leaf/, error: method PUT not allowed from=127.0.0.1:52806
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.488027 [DEBUG] http: Request PUT /v1/agent/connect/ca/leaf/ (920.358µs) from=127.0.0.1:52806
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.495215 [ERR] http: Request POST /v1/agent/connect/ca/leaf/, error: method POST not allowed from=127.0.0.1:52808
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.495847 [DEBUG] http: Request POST /v1/agent/connect/ca/leaf/ (642.017µs) from=127.0.0.1:52808
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.504098 [ERR] http: Request DELETE /v1/agent/connect/ca/leaf/, error: method DELETE not allowed from=127.0.0.1:52810
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.504921 [DEBUG] http: Request DELETE /v1/agent/connect/ca/leaf/ (825.356µs) from=127.0.0.1:52810
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.508492 [ERR] http: Request HEAD /v1/agent/connect/ca/leaf/, error: method HEAD not allowed from=127.0.0.1:52812
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.508649 [DEBUG] http: Request HEAD /v1/agent/connect/ca/leaf/ (193.005µs) from=127.0.0.1:52812
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.510655 [DEBUG] http: Request OPTIONS /v1/agent/connect/ca/leaf/ (16.667µs) from=127.0.0.1:52812
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.512521 [ERR] http: Request GET /v1/acl/bootstrap, error: method GET not allowed from=127.0.0.1:52812
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.513246 [DEBUG] http: Request GET /v1/acl/bootstrap (730.019µs) from=127.0.0.1:52812
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.517633 [DEBUG] http: Request PUT /v1/acl/bootstrap (875.356µs) from=127.0.0.1:52814
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.520994 [ERR] http: Request POST /v1/acl/bootstrap, error: method POST not allowed from=127.0.0.1:52816
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.521723 [DEBUG] http: Request POST /v1/acl/bootstrap (708.019µs) from=127.0.0.1:52816
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.525142 [ERR] http: Request DELETE /v1/acl/bootstrap, error: method DELETE not allowed from=127.0.0.1:52818
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.525865 [DEBUG] http: Request DELETE /v1/acl/bootstrap (632.016µs) from=127.0.0.1:52818
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.529006 [ERR] http: Request HEAD /v1/acl/bootstrap, error: method HEAD not allowed from=127.0.0.1:52820
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.529277 [DEBUG] http: Request HEAD /v1/acl/bootstrap (374.344µs) from=127.0.0.1:52820
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.531017 [DEBUG] http: Request OPTIONS /v1/acl/bootstrap (17.668µs) from=127.0.0.1:52820
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.533193 [ERR] http: Request GET /v1/acl/policies, error: Permission denied from=127.0.0.1:52820
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.533682 [DEBUG] http: Request GET /v1/acl/policies (998.36µs) from=127.0.0.1:52820
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.536894 [ERR] http: Request PUT /v1/acl/policies, error: method PUT not allowed from=127.0.0.1:52822
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.537554 [DEBUG] http: Request PUT /v1/acl/policies (667.017µs) from=127.0.0.1:52822
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.540760 [ERR] http: Request POST /v1/acl/policies, error: method POST not allowed from=127.0.0.1:52824
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.541694 [DEBUG] http: Request POST /v1/acl/policies (941.691µs) from=127.0.0.1:52824
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.550007 [ERR] http: Request DELETE /v1/acl/policies, error: method DELETE not allowed from=127.0.0.1:52826
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.550687 [DEBUG] http: Request DELETE /v1/acl/policies (667.684µs) from=127.0.0.1:52826
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.554798 [ERR] http: Request HEAD /v1/acl/policies, error: method HEAD not allowed from=127.0.0.1:52828
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.555193 [DEBUG] http: Request HEAD /v1/acl/policies (415.344µs) from=127.0.0.1:52828
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.566206 [DEBUG] http: Request OPTIONS /v1/acl/policies (17.668µs) from=127.0.0.1:52828
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.567758 [ERR] http: Request GET /v1/acl/binding-rule, error: method GET not allowed from=127.0.0.1:52828
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.568611 [DEBUG] http: Request GET /v1/acl/binding-rule (831.356µs) from=127.0.0.1:52828
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.575193 [ERR] http: Request PUT /v1/acl/binding-rule, error: Bad request: BindingRule decoding failed: EOF from=127.0.0.1:52830
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.575986 [DEBUG] http: Request PUT /v1/acl/binding-rule (842.355µs) from=127.0.0.1:52830
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.583801 [ERR] http: Request POST /v1/acl/binding-rule, error: method POST not allowed from=127.0.0.1:52832
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.584336 [DEBUG] http: Request POST /v1/acl/binding-rule (530.014µs) from=127.0.0.1:52832
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.588394 [ERR] http: Request DELETE /v1/acl/binding-rule, error: method DELETE not allowed from=127.0.0.1:52834
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.589124 [DEBUG] http: Request DELETE /v1/acl/binding-rule (728.686µs) from=127.0.0.1:52834
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.593038 [ERR] http: Request HEAD /v1/acl/binding-rule, error: method HEAD not allowed from=127.0.0.1:52836
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.593182 [DEBUG] http: Request HEAD /v1/acl/binding-rule (162.338µs) from=127.0.0.1:52836
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.595102 [DEBUG] http: Request OPTIONS /v1/acl/binding-rule (19.001µs) from=127.0.0.1:52836
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.599221 [DEBUG] http: Request GET /v1/agent/connect/ca/roots (2.306061ms) from=127.0.0.1:52836
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.604205 [ERR] http: Request PUT /v1/agent/connect/ca/roots, error: method PUT not allowed from=127.0.0.1:52838
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.605021 [DEBUG] http: Request PUT /v1/agent/connect/ca/roots (807.355µs) from=127.0.0.1:52838
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.608485 [ERR] http: Request POST /v1/agent/connect/ca/roots, error: method POST not allowed from=127.0.0.1:52840
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.609368 [DEBUG] http: Request POST /v1/agent/connect/ca/roots (883.023µs) from=127.0.0.1:52840
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.613345 [ERR] http: Request DELETE /v1/agent/connect/ca/roots, error: method DELETE not allowed from=127.0.0.1:52842
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.614617 [DEBUG] http: Request DELETE /v1/agent/connect/ca/roots (1.050361ms) from=127.0.0.1:52842
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.618094 [ERR] http: Request HEAD /v1/agent/connect/ca/roots, error: method HEAD not allowed from=127.0.0.1:52844
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.618247 [DEBUG] http: Request HEAD /v1/agent/connect/ca/roots (171.671µs) from=127.0.0.1:52844
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.619979 [DEBUG] http: Request OPTIONS /v1/agent/connect/ca/roots (17.334µs) from=127.0.0.1:52844
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.621646 [ERR] http: Request GET /v1/session/renew/, error: method GET not allowed from=127.0.0.1:52844
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.622357 [DEBUG] http: Request GET /v1/session/renew/ (711.352µs) from=127.0.0.1:52844
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.626341 [DEBUG] http: Request PUT /v1/session/renew/ (496.013µs) from=127.0.0.1:52846
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.629448 [ERR] http: Request POST /v1/session/renew/, error: method POST not allowed from=127.0.0.1:52848
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.629991 [DEBUG] http: Request POST /v1/session/renew/ (599.016µs) from=127.0.0.1:52848
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.633913 [ERR] http: Request DELETE /v1/session/renew/, error: method DELETE not allowed from=127.0.0.1:52850
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.634720 [DEBUG] http: Request DELETE /v1/session/renew/ (807.355µs) from=127.0.0.1:52850
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.640797 [ERR] http: Request HEAD /v1/session/renew/, error: method HEAD not allowed from=127.0.0.1:52852
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.640981 [DEBUG] http: Request HEAD /v1/session/renew/ (271.007µs) from=127.0.0.1:52852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.642686 [DEBUG] http: Request OPTIONS /v1/session/renew/ (15.334µs) from=127.0.0.1:52852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.648602 [ERR] http: Request GET /v1/acl/list, error: Permission denied from=127.0.0.1:52852
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.649518 [DEBUG] http: Request GET /v1/acl/list (1.331702ms) from=127.0.0.1:52852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.653927 [ERR] http: Request PUT /v1/acl/list, error: method PUT not allowed from=127.0.0.1:52854
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.655317 [DEBUG] http: Request PUT /v1/acl/list (1.380704ms) from=127.0.0.1:52854
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.659659 [ERR] http: Request POST /v1/acl/list, error: method POST not allowed from=127.0.0.1:52856
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.660866 [DEBUG] http: Request POST /v1/acl/list (1.203032ms) from=127.0.0.1:52856
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.664617 [ERR] http: Request DELETE /v1/acl/list, error: method DELETE not allowed from=127.0.0.1:52858
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.665207 [DEBUG] http: Request DELETE /v1/acl/list (595.349µs) from=127.0.0.1:52858
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.668332 [ERR] http: Request HEAD /v1/acl/list, error: method HEAD not allowed from=127.0.0.1:52860
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.668469 [DEBUG] http: Request HEAD /v1/acl/list (156.671µs) from=127.0.0.1:52860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.670186 [DEBUG] http: Request OPTIONS /v1/acl/list (15µs) from=127.0.0.1:52860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.671623 [ERR] http: Request GET /v1/agent/token/, error: method GET not allowed from=127.0.0.1:52860
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.672163 [DEBUG] http: Request GET /v1/agent/token/ (543.681µs) from=127.0.0.1:52860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.676309 [ERR] http: Request PUT /v1/agent/token/, error: Permission denied from=127.0.0.1:52862
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.676820 [DEBUG] http: Request PUT /v1/agent/token/ (633.683µs) from=127.0.0.1:52862
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.680226 [ERR] http: Request POST /v1/agent/token/, error: method POST not allowed from=127.0.0.1:52864
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.680744 [DEBUG] http: Request POST /v1/agent/token/ (523.68µs) from=127.0.0.1:52864
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.687736 [ERR] http: Request DELETE /v1/agent/token/, error: method DELETE not allowed from=127.0.0.1:52866
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.688272 [DEBUG] http: Request DELETE /v1/agent/token/ (541.681µs) from=127.0.0.1:52866
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.691622 [ERR] http: Request HEAD /v1/agent/token/, error: method HEAD not allowed from=127.0.0.1:52868
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.691769 [DEBUG] http: Request HEAD /v1/agent/token/ (169.338µs) from=127.0.0.1:52868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.694062 [DEBUG] http: Request OPTIONS /v1/agent/token/ (18.001µs) from=127.0.0.1:52868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.696381 [ERR] http: Request GET /v1/agent/leave, error: method GET not allowed from=127.0.0.1:52868
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.697005 [DEBUG] http: Request GET /v1/agent/leave (614.683µs) from=127.0.0.1:52868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.703046 [ERR] http: Request PUT /v1/agent/leave, error: Permission denied from=127.0.0.1:52870
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.703716 [DEBUG] http: Request PUT /v1/agent/leave (791.355µs) from=127.0.0.1:52870
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.707528 [ERR] http: Request POST /v1/agent/leave, error: method POST not allowed from=127.0.0.1:52872
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.708182 [DEBUG] http: Request POST /v1/agent/leave (648.017µs) from=127.0.0.1:52872
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.712514 [ERR] http: Request DELETE /v1/agent/leave, error: method DELETE not allowed from=127.0.0.1:52874
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.713238 [DEBUG] http: Request DELETE /v1/agent/leave (728.353µs) from=127.0.0.1:52874
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.716711 [ERR] http: Request HEAD /v1/agent/leave, error: method HEAD not allowed from=127.0.0.1:52876
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.716892 [DEBUG] http: Request HEAD /v1/agent/leave (205.339µs) from=127.0.0.1:52876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.718713 [DEBUG] http: Request OPTIONS /v1/agent/leave (14.667µs) from=127.0.0.1:52876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.720464 [ERR] http: Request GET /v1/agent/maintenance, error: method GET not allowed from=127.0.0.1:52876
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.721048 [DEBUG] http: Request GET /v1/agent/maintenance (600.016µs) from=127.0.0.1:52876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.724788 [DEBUG] http: Request PUT /v1/agent/maintenance (612.016µs) from=127.0.0.1:52878
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.733111 [ERR] http: Request POST /v1/agent/maintenance, error: method POST not allowed from=127.0.0.1:52880
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.733633 [DEBUG] http: Request POST /v1/agent/maintenance (546.681µs) from=127.0.0.1:52880
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.737172 [ERR] http: Request DELETE /v1/agent/maintenance, error: method DELETE not allowed from=127.0.0.1:52882
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.737676 [DEBUG] http: Request DELETE /v1/agent/maintenance (504.014µs) from=127.0.0.1:52882
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.744917 [ERR] http: Request HEAD /v1/agent/maintenance, error: method HEAD not allowed from=127.0.0.1:52884
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.745152 [DEBUG] http: Request HEAD /v1/agent/maintenance (250.007µs) from=127.0.0.1:52884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.746866 [DEBUG] http: Request OPTIONS /v1/agent/maintenance (17.334µs) from=127.0.0.1:52884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.748640 [ERR] http: Request GET /v1/agent/check/deregister/, error: method GET not allowed from=127.0.0.1:52884
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.749299 [DEBUG] http: Request GET /v1/agent/check/deregister/ (659.351µs) from=127.0.0.1:52884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.752933 [ERR] http: Request PUT /v1/agent/check/deregister/, error: Unknown check "" from=127.0.0.1:52886
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.753631 [DEBUG] http: Request PUT /v1/agent/check/deregister/ (841.023µs) from=127.0.0.1:52886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.756960 [ERR] http: Request POST /v1/agent/check/deregister/, error: method POST not allowed from=127.0.0.1:52888
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.757692 [DEBUG] http: Request POST /v1/agent/check/deregister/ (723.352µs) from=127.0.0.1:52888
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.763121 [ERR] http: Request DELETE /v1/agent/check/deregister/, error: method DELETE not allowed from=127.0.0.1:52890
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.763827 [DEBUG] http: Request DELETE /v1/agent/check/deregister/ (694.019µs) from=127.0.0.1:52890
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.769591 [ERR] http: Request HEAD /v1/agent/check/deregister/, error: method HEAD not allowed from=127.0.0.1:52892
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.769739 [DEBUG] http: Request HEAD /v1/agent/check/deregister/ (157.004µs) from=127.0.0.1:52892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.771319 [DEBUG] http: Request OPTIONS /v1/agent/check/deregister/ (16.334µs) from=127.0.0.1:52892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.774830 [ERR] http: Request GET /v1/acl/login, error: method GET not allowed from=127.0.0.1:52892
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.775825 [DEBUG] http: Request GET /v1/acl/login (985.026µs) from=127.0.0.1:52892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.779449 [ERR] http: Request PUT /v1/acl/login, error: method PUT not allowed from=127.0.0.1:52894
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.780280 [DEBUG] http: Request PUT /v1/acl/login (867.69µs) from=127.0.0.1:52894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.783754 [ERR] http: Request POST /v1/acl/login, error: Bad request: Failed to decode request body:: EOF from=127.0.0.1:52896
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.784815 [DEBUG] http: Request POST /v1/acl/login (1.076028ms) from=127.0.0.1:52896
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.788560 [ERR] http: Request DELETE /v1/acl/login, error: method DELETE not allowed from=127.0.0.1:52898
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.789519 [DEBUG] http: Request DELETE /v1/acl/login (867.69µs) from=127.0.0.1:52898
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.794251 [ERR] http: Request HEAD /v1/acl/login, error: method HEAD not allowed from=127.0.0.1:52900
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.794630 [DEBUG] http: Request HEAD /v1/acl/login (396.011µs) from=127.0.0.1:52900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.796541 [DEBUG] http: Request OPTIONS /v1/acl/login (19.668µs) from=127.0.0.1:52900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.801223 [ERR] http: Request GET /v1/acl/role/, error: Bad request: Missing role ID from=127.0.0.1:52900
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.807868 [DEBUG] http: Request GET /v1/acl/role/ (6.595175ms) from=127.0.0.1:52900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.818734 [ERR] http: Request PUT /v1/acl/role/, error: Bad request: Role decoding failed: EOF from=127.0.0.1:52902
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.819532 [DEBUG] http: Request PUT /v1/acl/role/ (829.688µs) from=127.0.0.1:52902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.826415 [ERR] http: Request POST /v1/acl/role/, error: method POST not allowed from=127.0.0.1:52904
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.827137 [DEBUG] http: Request POST /v1/acl/role/ (724.686µs) from=127.0.0.1:52904
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.830263 [ERR] http: Request DELETE /v1/acl/role/, error: Bad request: Missing role ID from=127.0.0.1:52906
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.830810 [DEBUG] http: Request DELETE /v1/acl/role/ (544.014µs) from=127.0.0.1:52906
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.834302 [ERR] http: Request HEAD /v1/acl/role/, error: method HEAD not allowed from=127.0.0.1:52908
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.834507 [DEBUG] http: Request HEAD /v1/acl/role/ (223.006µs) from=127.0.0.1:52908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.836196 [DEBUG] http: Request OPTIONS /v1/acl/role/ (19.334µs) from=127.0.0.1:52908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.837810 [ERR] http: Request GET /v1/acl/create, error: method GET not allowed from=127.0.0.1:52908
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.838490 [DEBUG] http: Request GET /v1/acl/create (680.018µs) from=127.0.0.1:52908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.842312 [ERR] http: Request PUT /v1/acl/create, error: Permission denied from=127.0.0.1:52910
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.842901 [DEBUG] http: Request PUT /v1/acl/create (1.068028ms) from=127.0.0.1:52910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.846248 [ERR] http: Request POST /v1/acl/create, error: method POST not allowed from=127.0.0.1:52912
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.846864 [DEBUG] http: Request POST /v1/acl/create (622.35µs) from=127.0.0.1:52912
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.850107 [ERR] http: Request DELETE /v1/acl/create, error: method DELETE not allowed from=127.0.0.1:52914
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.850707 [DEBUG] http: Request DELETE /v1/acl/create (591.015µs) from=127.0.0.1:52914
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.854006 [ERR] http: Request HEAD /v1/acl/create, error: method HEAD not allowed from=127.0.0.1:52916
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.854178 [DEBUG] http: Request HEAD /v1/acl/create (179.005µs) from=127.0.0.1:52916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.856063 [DEBUG] http: Request OPTIONS /v1/acl/create (14.667µs) from=127.0.0.1:52916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.860464 [ERR] http: Request GET /v1/acl/roles, error: Permission denied from=127.0.0.1:52916
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.861092 [DEBUG] http: Request GET /v1/acl/roles (1.200032ms) from=127.0.0.1:52916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.867332 [ERR] http: Request PUT /v1/acl/roles, error: method PUT not allowed from=127.0.0.1:52918
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.867877 [DEBUG] http: Request PUT /v1/acl/roles (544.015µs) from=127.0.0.1:52918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.871156 [ERR] http: Request POST /v1/acl/roles, error: method POST not allowed from=127.0.0.1:52920
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.872036 [DEBUG] http: Request POST /v1/acl/roles (859.023µs) from=127.0.0.1:52920
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.875918 [ERR] http: Request DELETE /v1/acl/roles, error: method DELETE not allowed from=127.0.0.1:52922
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.876531 [DEBUG] http: Request DELETE /v1/acl/roles (596.35µs) from=127.0.0.1:52922
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.880020 [ERR] http: Request HEAD /v1/acl/roles, error: method HEAD not allowed from=127.0.0.1:52924
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.880175 [DEBUG] http: Request HEAD /v1/acl/roles (176.338µs) from=127.0.0.1:52924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.882031 [DEBUG] http: Request OPTIONS /v1/acl/roles (16.001µs) from=127.0.0.1:52924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.883515 [ERR] http: Request GET /v1/acl/role/name/, error: Bad request: Missing role Name from=127.0.0.1:52924
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.884252 [DEBUG] http: Request GET /v1/acl/role/name/ (734.353µs) from=127.0.0.1:52924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.890778 [ERR] http: Request PUT /v1/acl/role/name/, error: method PUT not allowed from=127.0.0.1:52926
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.891478 [DEBUG] http: Request PUT /v1/acl/role/name/ (703.686µs) from=127.0.0.1:52926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.895189 [ERR] http: Request POST /v1/acl/role/name/, error: method POST not allowed from=127.0.0.1:52928
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.895777 [DEBUG] http: Request POST /v1/acl/role/name/ (589.683µs) from=127.0.0.1:52928
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.900329 [ERR] http: Request DELETE /v1/acl/role/name/, error: method DELETE not allowed from=127.0.0.1:52930
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.900936 [DEBUG] http: Request DELETE /v1/acl/role/name/ (608.349µs) from=127.0.0.1:52930
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.903830 [ERR] http: Request HEAD /v1/acl/role/name/, error: method HEAD not allowed from=127.0.0.1:52932
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.903990 [DEBUG] http: Request HEAD /v1/acl/role/name/ (173.671µs) from=127.0.0.1:52932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.905782 [DEBUG] http: Request OPTIONS /v1/acl/role/name/ (71.335µs) from=127.0.0.1:52932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.909219 [DEBUG] http: Request GET /v1/agent/checks (1.471372ms) from=127.0.0.1:52932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.912427 [ERR] http: Request PUT /v1/agent/checks, error: method PUT not allowed from=127.0.0.1:52934
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.913097 [DEBUG] http: Request PUT /v1/agent/checks (675.685µs) from=127.0.0.1:52934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.919951 [ERR] http: Request POST /v1/agent/checks, error: method POST not allowed from=127.0.0.1:52936
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.920610 [DEBUG] http: Request POST /v1/agent/checks (656.017µs) from=127.0.0.1:52936
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.930559 [ERR] http: Request DELETE /v1/agent/checks, error: method DELETE not allowed from=127.0.0.1:52938
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.931133 [DEBUG] http: Request DELETE /v1/agent/checks (584.015µs) from=127.0.0.1:52938
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.934357 [ERR] http: Request HEAD /v1/agent/checks, error: method HEAD not allowed from=127.0.0.1:52940
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.934544 [DEBUG] http: Request HEAD /v1/agent/checks (220.672µs) from=127.0.0.1:52940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.936054 [DEBUG] http: Request OPTIONS /v1/agent/checks (16µs) from=127.0.0.1:52940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.937556 [ERR] http: Request GET /v1/agent/check/fail/, error: method GET not allowed from=127.0.0.1:52940
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.938183 [DEBUG] http: Request GET /v1/agent/check/fail/ (621.683µs) from=127.0.0.1:52940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.941756 [ERR] http: Request PUT /v1/agent/check/fail/, error: Unknown check "" from=127.0.0.1:52942
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.942356 [DEBUG] http: Request PUT /v1/agent/check/fail/ (799.354µs) from=127.0.0.1:52942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.945560 [ERR] http: Request POST /v1/agent/check/fail/, error: method POST not allowed from=127.0.0.1:52944
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.946373 [DEBUG] http: Request POST /v1/agent/check/fail/ (805.021µs) from=127.0.0.1:52944
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.950670 [ERR] http: Request DELETE /v1/agent/check/fail/, error: method DELETE not allowed from=127.0.0.1:52946
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.951459 [DEBUG] http: Request DELETE /v1/agent/check/fail/ (783.354µs) from=127.0.0.1:52946
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.954573 [ERR] http: Request HEAD /v1/agent/check/fail/, error: method HEAD not allowed from=127.0.0.1:52948
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.954852 [DEBUG] http: Request HEAD /v1/agent/check/fail/ (292.341µs) from=127.0.0.1:52948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.959639 [DEBUG] http: Request OPTIONS /v1/agent/check/fail/ (15.667µs) from=127.0.0.1:52948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.963106 [DEBUG] http: Request GET /v1/connect/intentions (1.554042ms) from=127.0.0.1:52948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.966532 [ERR] http: Request PUT /v1/connect/intentions, error: method PUT not allowed from=127.0.0.1:52950
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.967466 [DEBUG] http: Request PUT /v1/connect/intentions (912.025µs) from=127.0.0.1:52950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.971050 [ERR] http: Request POST /v1/connect/intentions, error: Failed to decode request body: EOF from=127.0.0.1:52952
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.971948 [DEBUG] http: Request POST /v1/connect/intentions (909.357µs) from=127.0.0.1:52952
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.975176 [ERR] http: Request DELETE /v1/connect/intentions, error: method DELETE not allowed from=127.0.0.1:52954
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.975932 [DEBUG] http: Request DELETE /v1/connect/intentions (739.687µs) from=127.0.0.1:52954
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.980709 [ERR] http: Request HEAD /v1/connect/intentions, error: method HEAD not allowed from=127.0.0.1:52956
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.980872 [DEBUG] http: Request HEAD /v1/connect/intentions (187.005µs) from=127.0.0.1:52956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.982894 [DEBUG] http: Request OPTIONS /v1/connect/intentions (13.667µs) from=127.0.0.1:52956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.985443 [ERR] http: Request GET /v1/acl/binding-rules, error: Permission denied from=127.0.0.1:52956
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:03.986180 [DEBUG] http: Request GET /v1/acl/binding-rules (1.205699ms) from=127.0.0.1:52956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.000693 [ERR] http: Request PUT /v1/acl/binding-rules, error: method PUT not allowed from=127.0.0.1:52958
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.001277 [DEBUG] http: Request PUT /v1/acl/binding-rules (585.682µs) from=127.0.0.1:52958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.004904 [ERR] http: Request POST /v1/acl/binding-rules, error: method POST not allowed from=127.0.0.1:52960
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.005439 [DEBUG] http: Request POST /v1/acl/binding-rules (545.347µs) from=127.0.0.1:52960
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.009060 [ERR] http: Request DELETE /v1/acl/binding-rules, error: method DELETE not allowed from=127.0.0.1:52962
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.009868 [DEBUG] http: Request DELETE /v1/acl/binding-rules (780.354µs) from=127.0.0.1:52962
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.012843 [ERR] http: Request HEAD /v1/acl/binding-rules, error: method HEAD not allowed from=127.0.0.1:52964
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.013024 [DEBUG] http: Request HEAD /v1/acl/binding-rules (215.339µs) from=127.0.0.1:52964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.014837 [DEBUG] http: Request OPTIONS /v1/acl/binding-rules (15µs) from=127.0.0.1:52964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.016647 [ERR] http: Request GET /v1/acl/auth-method, error: method GET not allowed from=127.0.0.1:52964
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.017148 [DEBUG] http: Request GET /v1/acl/auth-method (502.013µs) from=127.0.0.1:52964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.021283 [ERR] http: Request PUT /v1/acl/auth-method, error: Bad request: AuthMethod decoding failed: EOF from=127.0.0.1:52966
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.021988 [DEBUG] http: Request PUT /v1/acl/auth-method (739.687µs) from=127.0.0.1:52966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.025912 [ERR] http: Request POST /v1/acl/auth-method, error: method POST not allowed from=127.0.0.1:52968
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.026687 [DEBUG] http: Request POST /v1/acl/auth-method (685.352µs) from=127.0.0.1:52968
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.029799 [ERR] http: Request DELETE /v1/acl/auth-method, error: method DELETE not allowed from=127.0.0.1:52970
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.030584 [DEBUG] http: Request DELETE /v1/acl/auth-method (775.353µs) from=127.0.0.1:52970
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.042009 [ERR] http: Request HEAD /v1/acl/auth-method, error: method HEAD not allowed from=127.0.0.1:52972
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.042158 [DEBUG] http: Request HEAD /v1/acl/auth-method (165.671µs) from=127.0.0.1:52972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.043667 [DEBUG] http: Request OPTIONS /v1/acl/auth-method (17.333µs) from=127.0.0.1:52972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.045422 [ERR] http: Request GET /v1/agent/metrics, error: Permission denied from=127.0.0.1:52972
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.046248 [DEBUG] http: Request GET /v1/agent/metrics (949.358µs) from=127.0.0.1:52972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.049481 [ERR] http: Request PUT /v1/agent/metrics, error: method PUT not allowed from=127.0.0.1:52974
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.050209 [DEBUG] http: Request PUT /v1/agent/metrics (795.021µs) from=127.0.0.1:52974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.053404 [ERR] http: Request POST /v1/agent/metrics, error: method POST not allowed from=127.0.0.1:52976
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.054145 [DEBUG] http: Request POST /v1/agent/metrics (741.02µs) from=127.0.0.1:52976
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.057834 [ERR] http: Request DELETE /v1/agent/metrics, error: method DELETE not allowed from=127.0.0.1:52978
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.058491 [DEBUG] http: Request DELETE /v1/agent/metrics (659.017µs) from=127.0.0.1:52978
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.061571 [ERR] http: Request HEAD /v1/agent/metrics, error: method HEAD not allowed from=127.0.0.1:52980
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.061803 [DEBUG] http: Request HEAD /v1/agent/metrics (258.673µs) from=127.0.0.1:52980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.063503 [DEBUG] http: Request OPTIONS /v1/agent/metrics (18.667µs) from=127.0.0.1:52980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.065635 [ERR] http: Request GET /v1/connect/intentions/, error: Bad request: failed intention lookup: index error: UUID must be 36 characters from=127.0.0.1:52980
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.066409 [DEBUG] http: Request GET /v1/connect/intentions/ (1.188698ms) from=127.0.0.1:52980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.070196 [DEBUG] http: Request PUT /v1/connect/intentions/ (627.017µs) from=127.0.0.1:52982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.074364 [ERR] http: Request POST /v1/connect/intentions/, error: method POST not allowed from=127.0.0.1:52984
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.075220 [DEBUG] http: Request POST /v1/connect/intentions/ (852.689µs) from=127.0.0.1:52984
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.080345 [ERR] http: Request DELETE /v1/connect/intentions/, error: Intention lookup failed: failed intention lookup: index error: UUID must be 36 characters from=127.0.0.1:52986
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.081154 [DEBUG] http: Request DELETE /v1/connect/intentions/ (1.323035ms) from=127.0.0.1:52986
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.084756 [ERR] http: Request HEAD /v1/connect/intentions/, error: method HEAD not allowed from=127.0.0.1:52988
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.084936 [DEBUG] http: Request HEAD /v1/connect/intentions/ (206.339µs) from=127.0.0.1:52988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.087012 [DEBUG] http: Request OPTIONS /v1/connect/intentions/ (19µs) from=127.0.0.1:52988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.094493 [ERR] http: Request GET /v1/operator/autopilot/configuration, error: Permission denied from=127.0.0.1:52988
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.095244 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (1.326702ms) from=127.0.0.1:52988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.099562 [DEBUG] http: Request PUT /v1/operator/autopilot/configuration (683.018µs) from=127.0.0.1:52990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.104190 [ERR] http: Request POST /v1/operator/autopilot/configuration, error: method POST not allowed from=127.0.0.1:52992
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.104994 [DEBUG] http: Request POST /v1/operator/autopilot/configuration (800.022µs) from=127.0.0.1:52992
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.109335 [ERR] http: Request DELETE /v1/operator/autopilot/configuration, error: method DELETE not allowed from=127.0.0.1:52994
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.110034 [DEBUG] http: Request DELETE /v1/operator/autopilot/configuration (704.352µs) from=127.0.0.1:52994
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.113019 [ERR] http: Request HEAD /v1/operator/autopilot/configuration, error: method HEAD not allowed from=127.0.0.1:52996
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.113174 [DEBUG] http: Request HEAD /v1/operator/autopilot/configuration (169.004µs) from=127.0.0.1:52996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.114738 [DEBUG] http: Request OPTIONS /v1/operator/autopilot/configuration (15.334µs) from=127.0.0.1:52996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.116362 [ERR] http: Request GET /v1/connect/intentions/check, error: required query parameter 'source' not set from=127.0.0.1:52996
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.116867 [DEBUG] http: Request GET /v1/connect/intentions/check (540.014µs) from=127.0.0.1:52996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.119777 [ERR] http: Request PUT /v1/connect/intentions/check, error: method PUT not allowed from=127.0.0.1:52998
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.120319 [DEBUG] http: Request PUT /v1/connect/intentions/check (553.348µs) from=127.0.0.1:52998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.123710 [ERR] http: Request POST /v1/connect/intentions/check, error: method POST not allowed from=127.0.0.1:53000
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.124485 [DEBUG] http: Request POST /v1/connect/intentions/check (694.018µs) from=127.0.0.1:53000
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.127570 [ERR] http: Request DELETE /v1/connect/intentions/check, error: method DELETE not allowed from=127.0.0.1:53002
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.128344 [DEBUG] http: Request DELETE /v1/connect/intentions/check (767.687µs) from=127.0.0.1:53002
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.131311 [ERR] http: Request HEAD /v1/connect/intentions/check, error: method HEAD not allowed from=127.0.0.1:53004
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.131463 [DEBUG] http: Request HEAD /v1/connect/intentions/check (179.338µs) from=127.0.0.1:53004
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.133359 [DEBUG] http: Request OPTIONS /v1/connect/intentions/check (17µs) from=127.0.0.1:53004
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.134959 [ERR] http: Request GET /v1/txn, error: method GET not allowed from=127.0.0.1:53004
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.135524 [DEBUG] http: Request GET /v1/txn (567.681µs) from=127.0.0.1:53004
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.139039 [DEBUG] http: Request PUT /v1/txn (528.348µs) from=127.0.0.1:53006
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.142370 [ERR] http: Request POST /v1/txn, error: method POST not allowed from=127.0.0.1:53008
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.142946 [DEBUG] http: Request POST /v1/txn (586.682µs) from=127.0.0.1:53008
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.150662 [ERR] http: Request DELETE /v1/txn, error: method DELETE not allowed from=127.0.0.1:53010
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.151282 [DEBUG] http: Request DELETE /v1/txn (623.35µs) from=127.0.0.1:53010
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.154451 [ERR] http: Request HEAD /v1/txn, error: method HEAD not allowed from=127.0.0.1:53012
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.154650 [DEBUG] http: Request HEAD /v1/txn (303.341µs) from=127.0.0.1:53012
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.156165 [DEBUG] http: Request OPTIONS /v1/txn (12.667µs) from=127.0.0.1:53012
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.157936 [ERR] http: Request GET /v1/acl/role, error: method GET not allowed from=127.0.0.1:53012
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.158456 [DEBUG] http: Request GET /v1/acl/role (526.014µs) from=127.0.0.1:53012
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.162200 [ERR] http: Request PUT /v1/acl/role, error: Bad request: Role decoding failed: EOF from=127.0.0.1:53014
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.162817 [DEBUG] http: Request PUT /v1/acl/role (662.684µs) from=127.0.0.1:53014
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.165996 [ERR] http: Request POST /v1/acl/role, error: method POST not allowed from=127.0.0.1:53016
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.166512 [DEBUG] http: Request POST /v1/acl/role (523.347µs) from=127.0.0.1:53016
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.169221 [ERR] http: Request DELETE /v1/acl/role, error: method DELETE not allowed from=127.0.0.1:53018
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.169954 [DEBUG] http: Request DELETE /v1/acl/role (731.019µs) from=127.0.0.1:53018
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.172817 [ERR] http: Request HEAD /v1/acl/role, error: method HEAD not allowed from=127.0.0.1:53020
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.173072 [DEBUG] http: Request HEAD /v1/acl/role (329.675µs) from=127.0.0.1:53020
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.174650 [DEBUG] http: Request OPTIONS /v1/acl/role (16.668µs) from=127.0.0.1:53020
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.176372 [ERR] http: Request GET /v1/agent/self, error: Permission denied from=127.0.0.1:53020
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.176886 [DEBUG] http: Request GET /v1/agent/self (654.351µs) from=127.0.0.1:53020
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.179588 [ERR] http: Request PUT /v1/agent/self, error: method PUT not allowed from=127.0.0.1:53022
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.180186 [DEBUG] http: Request PUT /v1/agent/self (609.016µs) from=127.0.0.1:53022
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.183257 [ERR] http: Request POST /v1/agent/self, error: method POST not allowed from=127.0.0.1:53024
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.183939 [DEBUG] http: Request POST /v1/agent/self (686.352µs) from=127.0.0.1:53024
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.186795 [ERR] http: Request DELETE /v1/agent/self, error: method DELETE not allowed from=127.0.0.1:53026
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.187381 [DEBUG] http: Request DELETE /v1/agent/self (575.015µs) from=127.0.0.1:53026
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.190284 [ERR] http: Request HEAD /v1/agent/self, error: method HEAD not allowed from=127.0.0.1:53028
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.190434 [DEBUG] http: Request HEAD /v1/agent/self (168.338µs) from=127.0.0.1:53028
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.192020 [DEBUG] http: Request OPTIONS /v1/agent/self (13µs) from=127.0.0.1:53028
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.195175 [DEBUG] http: Request GET /v1/config/ (1.514707ms) from=127.0.0.1:53028
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.198114 [ERR] http: Request PUT /v1/config/, error: method PUT not allowed from=127.0.0.1:53030
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.198723 [DEBUG] http: Request PUT /v1/config/ (620.683µs) from=127.0.0.1:53030
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.201577 [ERR] http: Request POST /v1/config/, error: method POST not allowed from=127.0.0.1:53032
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.202134 [DEBUG] http: Request POST /v1/config/ (566.015µs) from=127.0.0.1:53032
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.205466 [DEBUG] http: Request DELETE /v1/config/ (547.681µs) from=127.0.0.1:53034
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.208554 [ERR] http: Request HEAD /v1/config/, error: method HEAD not allowed from=127.0.0.1:53036
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.208694 [DEBUG] http: Request HEAD /v1/config/ (161.338µs) from=127.0.0.1:53036
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.210140 [DEBUG] http: Request OPTIONS /v1/config/ (16.333µs) from=127.0.0.1:53036
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.212082 [ERR] http: Request GET /v1/agent/force-leave/, error: method GET not allowed from=127.0.0.1:53036
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.212720 [DEBUG] http: Request GET /v1/agent/force-leave/ (641.018µs) from=127.0.0.1:53036
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.216131 [ERR] http: Request PUT /v1/agent/force-leave/, error: Permission denied from=127.0.0.1:53038
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.216649 [DEBUG] http: Request PUT /v1/agent/force-leave/ (646.683µs) from=127.0.0.1:53038
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.219748 [ERR] http: Request POST /v1/agent/force-leave/, error: method POST not allowed from=127.0.0.1:53040
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.220436 [DEBUG] http: Request POST /v1/agent/force-leave/ (689.685µs) from=127.0.0.1:53040
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.223634 [ERR] http: Request DELETE /v1/agent/force-leave/, error: method DELETE not allowed from=127.0.0.1:53042
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.224267 [DEBUG] http: Request DELETE /v1/agent/force-leave/ (635.683µs) from=127.0.0.1:53042
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.227558 [ERR] http: Request HEAD /v1/agent/force-leave/, error: method HEAD not allowed from=127.0.0.1:53044
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.227734 [DEBUG] http: Request HEAD /v1/agent/force-leave/ (203.672µs) from=127.0.0.1:53044
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.229335 [DEBUG] http: Request OPTIONS /v1/agent/force-leave/ (17.334µs) from=127.0.0.1:53044
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.231115 [ERR] http: Request GET /v1/agent/check/warn/, error: method GET not allowed from=127.0.0.1:53044
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.231674 [DEBUG] http: Request GET /v1/agent/check/warn/ (556.681µs) from=127.0.0.1:53044
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.234949 [ERR] http: Request PUT /v1/agent/check/warn/, error: Unknown check "" from=127.0.0.1:53046
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.235513 [DEBUG] http: Request PUT /v1/agent/check/warn/ (784.688µs) from=127.0.0.1:53046
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.238343 [ERR] http: Request POST /v1/agent/check/warn/, error: method POST not allowed from=127.0.0.1:53048
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.238940 [DEBUG] http: Request POST /v1/agent/check/warn/ (602.016µs) from=127.0.0.1:53048
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.242020 [ERR] http: Request DELETE /v1/agent/check/warn/, error: method DELETE not allowed from=127.0.0.1:53050
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.242710 [DEBUG] http: Request DELETE /v1/agent/check/warn/ (642.351µs) from=127.0.0.1:53050
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.246233 [ERR] http: Request HEAD /v1/agent/check/warn/, error: method HEAD not allowed from=127.0.0.1:53052
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.246392 [DEBUG] http: Request HEAD /v1/agent/check/warn/ (184.672µs) from=127.0.0.1:53052
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.248231 [DEBUG] http: Request OPTIONS /v1/agent/check/warn/ (16.001µs) from=127.0.0.1:53052
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.250376 [DEBUG] http: Request GET /v1/health/checks/ (572.015µs) from=127.0.0.1:53052
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.253222 [ERR] http: Request PUT /v1/health/checks/, error: method PUT not allowed from=127.0.0.1:53054
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.253904 [DEBUG] http: Request PUT /v1/health/checks/ (687.351µs) from=127.0.0.1:53054
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.257126 [ERR] http: Request POST /v1/health/checks/, error: method POST not allowed from=127.0.0.1:53056
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.257731 [DEBUG] http: Request POST /v1/health/checks/ (608.683µs) from=127.0.0.1:53056
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.260727 [ERR] http: Request DELETE /v1/health/checks/, error: method DELETE not allowed from=127.0.0.1:53058
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.261246 [DEBUG] http: Request DELETE /v1/health/checks/ (521.014µs) from=127.0.0.1:53058
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.264215 [ERR] http: Request HEAD /v1/health/checks/, error: method HEAD not allowed from=127.0.0.1:53060
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.264352 [DEBUG] http: Request HEAD /v1/health/checks/ (151.004µs) from=127.0.0.1:53060
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.266007 [DEBUG] http: Request OPTIONS /v1/health/checks/ (16.334µs) from=127.0.0.1:53060
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.281138 [DEBUG] consul: dropping node "Node a465168d-d9e1-e400-3047-69dddd922f5b" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.283772 [DEBUG] http: Request GET /v1/internal/ui/nodes (3.222085ms) from=127.0.0.1:53060
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.286626 [ERR] http: Request PUT /v1/internal/ui/nodes, error: method PUT not allowed from=127.0.0.1:53062
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.287240 [DEBUG] http: Request PUT /v1/internal/ui/nodes (616.016µs) from=127.0.0.1:53062
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.290113 [ERR] http: Request POST /v1/internal/ui/nodes, error: method POST not allowed from=127.0.0.1:53064
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.290758 [DEBUG] http: Request POST /v1/internal/ui/nodes (651.017µs) from=127.0.0.1:53064
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.293547 [ERR] http: Request DELETE /v1/internal/ui/nodes, error: method DELETE not allowed from=127.0.0.1:53066
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.294077 [DEBUG] http: Request DELETE /v1/internal/ui/nodes (530.348µs) from=127.0.0.1:53066
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.297159 [ERR] http: Request HEAD /v1/internal/ui/nodes, error: method HEAD not allowed from=127.0.0.1:53068
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.297313 [DEBUG] http: Request HEAD /v1/internal/ui/nodes (181.672µs) from=127.0.0.1:53068
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.298646 [DEBUG] http: Request OPTIONS /v1/internal/ui/nodes (13.334µs) from=127.0.0.1:53068
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.299935 [ERR] http: Request GET /v1/catalog/register, error: method GET not allowed from=127.0.0.1:53068
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.300399 [DEBUG] http: Request GET /v1/catalog/register (467.012µs) from=127.0.0.1:53068
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.303457 [DEBUG] http: Request PUT /v1/catalog/register (413.677µs) from=127.0.0.1:53070
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.306515 [ERR] http: Request POST /v1/catalog/register, error: method POST not allowed from=127.0.0.1:53072
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.307150 [DEBUG] http: Request POST /v1/catalog/register (638.684µs) from=127.0.0.1:53072
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.309964 [ERR] http: Request DELETE /v1/catalog/register, error: method DELETE not allowed from=127.0.0.1:53074
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.310459 [DEBUG] http: Request DELETE /v1/catalog/register (499.68µs) from=127.0.0.1:53074
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.313898 [ERR] http: Request HEAD /v1/catalog/register, error: method HEAD not allowed from=127.0.0.1:53076
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.314054 [DEBUG] http: Request HEAD /v1/catalog/register (153.004µs) from=127.0.0.1:53076
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.315587 [DEBUG] http: Request OPTIONS /v1/catalog/register (16µs) from=127.0.0.1:53076
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.318041 [DEBUG] http: Request GET /v1/connect/ca/roots (988.36µs) from=127.0.0.1:53076
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.321786 [ERR] http: Request PUT /v1/connect/ca/roots, error: method PUT not allowed from=127.0.0.1:53078
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.322296 [DEBUG] http: Request PUT /v1/connect/ca/roots (515.68µs) from=127.0.0.1:53078
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.324974 [ERR] http: Request POST /v1/connect/ca/roots, error: method POST not allowed from=127.0.0.1:53080
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.325489 [DEBUG] http: Request POST /v1/connect/ca/roots (513.014µs) from=127.0.0.1:53080
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.328339 [ERR] http: Request DELETE /v1/connect/ca/roots, error: method DELETE not allowed from=127.0.0.1:53082
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.328844 [DEBUG] http: Request DELETE /v1/connect/ca/roots (507.347µs) from=127.0.0.1:53082
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.335828 [ERR] http: Request HEAD /v1/connect/ca/roots, error: method HEAD not allowed from=127.0.0.1:53084
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.336482 [DEBUG] http: Request HEAD /v1/connect/ca/roots (696.019µs) from=127.0.0.1:53084
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.338488 [DEBUG] http: Request OPTIONS /v1/connect/ca/roots (16.333µs) from=127.0.0.1:53084
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.340335 [ERR] http: Request GET /v1/agent/health/service/name/, error: Bad request: Missing service Name from=127.0.0.1:53084
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.341021 [DEBUG] http: Request GET /v1/agent/health/service/name/ (691.352µs) from=127.0.0.1:53084
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.345070 [ERR] http: Request PUT /v1/agent/health/service/name/, error: method PUT not allowed from=127.0.0.1:53086
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.345848 [DEBUG] http: Request PUT /v1/agent/health/service/name/ (772.021µs) from=127.0.0.1:53086
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.350618 [ERR] http: Request POST /v1/agent/health/service/name/, error: method POST not allowed from=127.0.0.1:53088
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.351849 [DEBUG] http: Request POST /v1/agent/health/service/name/ (1.217032ms) from=127.0.0.1:53088
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.356357 [ERR] http: Request DELETE /v1/agent/health/service/name/, error: method DELETE not allowed from=127.0.0.1:53090
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.357225 [DEBUG] http: Request DELETE /v1/agent/health/service/name/ (861.69µs) from=127.0.0.1:53090
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.361864 [ERR] http: Request HEAD /v1/agent/health/service/name/, error: method HEAD not allowed from=127.0.0.1:53092
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.362126 [DEBUG] http: Request HEAD /v1/agent/health/service/name/ (282.007µs) from=127.0.0.1:53092
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.363966 [DEBUG] http: Request OPTIONS /v1/agent/health/service/name/ (17.334µs) from=127.0.0.1:53092
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.366902 [DEBUG] http: Request GET /v1/catalog/node/ (530.348µs) from=127.0.0.1:53092
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.370723 [ERR] http: Request PUT /v1/catalog/node/, error: method PUT not allowed from=127.0.0.1:53094
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.371644 [DEBUG] http: Request PUT /v1/catalog/node/ (807.021µs) from=127.0.0.1:53094
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.374514 [ERR] http: Request POST /v1/catalog/node/, error: method POST not allowed from=127.0.0.1:53096
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.375274 [DEBUG] http: Request POST /v1/catalog/node/ (743.019µs) from=127.0.0.1:53096
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.378801 [ERR] http: Request DELETE /v1/catalog/node/, error: method DELETE not allowed from=127.0.0.1:53098
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.379837 [DEBUG] http: Request DELETE /v1/catalog/node/ (1.012026ms) from=127.0.0.1:53098
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.383422 [ERR] http: Request HEAD /v1/catalog/node/, error: method HEAD not allowed from=127.0.0.1:53100
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.383595 [DEBUG] http: Request HEAD /v1/catalog/node/ (213.339µs) from=127.0.0.1:53100
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.385612 [DEBUG] http: Request OPTIONS /v1/catalog/node/ (71.002µs) from=127.0.0.1:53100
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.388311 [ERR] http: Request GET /v1/acl/binding-rule/, error: Bad request: Missing binding rule ID from=127.0.0.1:53100
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.390263 [DEBUG] http: Request GET /v1/acl/binding-rule/ (1.959385ms) from=127.0.0.1:53100
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.393858 [ERR] http: Request PUT /v1/acl/binding-rule/, error: Bad request: BindingRule decoding failed: EOF from=127.0.0.1:53102
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.394682 [DEBUG] http: Request PUT /v1/acl/binding-rule/ (850.356µs) from=127.0.0.1:53102
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.397619 [ERR] http: Request POST /v1/acl/binding-rule/, error: method POST not allowed from=127.0.0.1:53104
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.398161 [DEBUG] http: Request POST /v1/acl/binding-rule/ (545.681µs) from=127.0.0.1:53104
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.401180 [ERR] http: Request DELETE /v1/acl/binding-rule/, error: Bad request: Missing binding rule ID from=127.0.0.1:53106
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.401820 [DEBUG] http: Request DELETE /v1/acl/binding-rule/ (630.017µs) from=127.0.0.1:53106
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.404721 [ERR] http: Request HEAD /v1/acl/binding-rule/, error: method HEAD not allowed from=127.0.0.1:53108
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.404868 [DEBUG] http: Request HEAD /v1/acl/binding-rule/ (201.006µs) from=127.0.0.1:53108
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.406530 [DEBUG] http: Request OPTIONS /v1/acl/binding-rule/ (14.334µs) from=127.0.0.1:53108
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.408096 [ERR] http: Request GET /v1/acl/token, error: method GET not allowed from=127.0.0.1:53108
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.408673 [DEBUG] http: Request GET /v1/acl/token (577.348µs) from=127.0.0.1:53108
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.411675 [ERR] http: Request PUT /v1/acl/token, error: Bad request: Token decoding failed: EOF from=127.0.0.1:53110
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.412305 [DEBUG] http: Request PUT /v1/acl/token (667.684µs) from=127.0.0.1:53110
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.418034 [ERR] http: Request POST /v1/acl/token, error: method POST not allowed from=127.0.0.1:53112
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.418645 [DEBUG] http: Request POST /v1/acl/token (618.017µs) from=127.0.0.1:53112
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.422134 [ERR] http: Request DELETE /v1/acl/token, error: method DELETE not allowed from=127.0.0.1:53114
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.422882 [DEBUG] http: Request DELETE /v1/acl/token (759.354µs) from=127.0.0.1:53114
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.427005 [ERR] http: Request HEAD /v1/acl/token, error: method HEAD not allowed from=127.0.0.1:53116
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.427292 [DEBUG] http: Request HEAD /v1/acl/token (302.342µs) from=127.0.0.1:53116
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.429552 [DEBUG] http: Request OPTIONS /v1/acl/token (17.667µs) from=127.0.0.1:53116
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.431502 [ERR] http: Request GET /v1/operator/raft/configuration, error: Permission denied from=127.0.0.1:53116
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.432448 [DEBUG] http: Request GET /v1/operator/raft/configuration (1.351036ms) from=127.0.0.1:53116
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.435739 [ERR] http: Request PUT /v1/operator/raft/configuration, error: method PUT not allowed from=127.0.0.1:53118
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.436385 [DEBUG] http: Request PUT /v1/operator/raft/configuration (637.684µs) from=127.0.0.1:53118
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.439439 [ERR] http: Request POST /v1/operator/raft/configuration, error: method POST not allowed from=127.0.0.1:53120
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.440085 [DEBUG] http: Request POST /v1/operator/raft/configuration (708.019µs) from=127.0.0.1:53120
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.442805 [ERR] http: Request DELETE /v1/operator/raft/configuration, error: method DELETE not allowed from=127.0.0.1:53122
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.443442 [DEBUG] http: Request DELETE /v1/operator/raft/configuration (649.017µs) from=127.0.0.1:53122
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.446393 [ERR] http: Request HEAD /v1/operator/raft/configuration, error: method HEAD not allowed from=127.0.0.1:53124
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.446582 [DEBUG] http: Request HEAD /v1/operator/raft/configuration (212.006µs) from=127.0.0.1:53124
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.448347 [DEBUG] http: Request OPTIONS /v1/operator/raft/configuration (16.667µs) from=127.0.0.1:53124
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.451191 [DEBUG] http: Request GET /v1/session/list (1.206032ms) from=127.0.0.1:53124
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.460223 [ERR] http: Request PUT /v1/session/list, error: method PUT not allowed from=127.0.0.1:53126
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.460879 [DEBUG] http: Request PUT /v1/session/list (667.685µs) from=127.0.0.1:53126
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.464341 [ERR] http: Request POST /v1/session/list, error: method POST not allowed from=127.0.0.1:53128
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.465076 [DEBUG] http: Request POST /v1/session/list (693.685µs) from=127.0.0.1:53128
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.468793 [ERR] http: Request DELETE /v1/session/list, error: method DELETE not allowed from=127.0.0.1:53130
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.469567 [DEBUG] http: Request DELETE /v1/session/list (760.687µs) from=127.0.0.1:53130
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.473168 [ERR] http: Request HEAD /v1/session/list, error: method HEAD not allowed from=127.0.0.1:53132
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.473443 [DEBUG] http: Request HEAD /v1/session/list (342.009µs) from=127.0.0.1:53132
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.475784 [DEBUG] http: Request OPTIONS /v1/session/list (19µs) from=127.0.0.1:53132
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.477914 [ERR] http: Request GET /v1/agent/service/maintenance/, error: method GET not allowed from=127.0.0.1:53132
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.478608 [DEBUG] http: Request GET /v1/agent/service/maintenance/ (694.018µs) from=127.0.0.1:53132
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.483103 [DEBUG] http: Request PUT /v1/agent/service/maintenance/ (534.348µs) from=127.0.0.1:53134
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.486846 [ERR] http: Request POST /v1/agent/service/maintenance/, error: method POST not allowed from=127.0.0.1:53136
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.487665 [DEBUG] http: Request POST /v1/agent/service/maintenance/ (740.686µs) from=127.0.0.1:53136
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.494933 [ERR] http: Request DELETE /v1/agent/service/maintenance/, error: method DELETE not allowed from=127.0.0.1:53138
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.495678 [DEBUG] http: Request DELETE /v1/agent/service/maintenance/ (752.353µs) from=127.0.0.1:53138
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.499443 [ERR] http: Request HEAD /v1/agent/service/maintenance/, error: method HEAD not allowed from=127.0.0.1:53140
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.499704 [DEBUG] http: Request HEAD /v1/agent/service/maintenance/ (292.007µs) from=127.0.0.1:53140
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.501659 [DEBUG] http: Request OPTIONS /v1/agent/service/maintenance/ (21.333µs) from=127.0.0.1:53140
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.511252 [ERR] http: Request GET /v1/acl/logout, error: method GET not allowed from=127.0.0.1:53140
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.512236 [DEBUG] http: Request GET /v1/acl/logout (992.36µs) from=127.0.0.1:53140
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.515754 [ERR] http: Request PUT /v1/acl/logout, error: method PUT not allowed from=127.0.0.1:53142
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.516317 [DEBUG] http: Request PUT /v1/acl/logout (627.35µs) from=127.0.0.1:53142
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.524653 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:53144
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.525282 [DEBUG] http: Request POST /v1/acl/logout (701.018µs) from=127.0.0.1:53144
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.528836 [ERR] http: Request DELETE /v1/acl/logout, error: method DELETE not allowed from=127.0.0.1:53146
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.529551 [DEBUG] http: Request DELETE /v1/acl/logout (596.016µs) from=127.0.0.1:53146
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.538958 [ERR] http: Request HEAD /v1/acl/logout, error: method HEAD not allowed from=127.0.0.1:53148
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.539117 [DEBUG] http: Request HEAD /v1/acl/logout (181.005µs) from=127.0.0.1:53148
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.541002 [DEBUG] http: Request OPTIONS /v1/acl/logout (15.334µs) from=127.0.0.1:53148
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.542456 [DEBUG] agent: dropping node "Node a465168d-d9e1-e400-3047-69dddd922f5b" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.543446 [DEBUG] http: Request GET /v1/agent/members (1.132697ms) from=127.0.0.1:53148
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.547420 [ERR] http: Request PUT /v1/agent/members, error: method PUT not allowed from=127.0.0.1:53150
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.547915 [DEBUG] http: Request PUT /v1/agent/members (493.679µs) from=127.0.0.1:53150
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.550953 [ERR] http: Request POST /v1/agent/members, error: method POST not allowed from=127.0.0.1:53152
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.551431 [DEBUG] http: Request POST /v1/agent/members (476.346µs) from=127.0.0.1:53152
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.557161 [ERR] http: Request DELETE /v1/agent/members, error: method DELETE not allowed from=127.0.0.1:53154
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.557905 [DEBUG] http: Request DELETE /v1/agent/members (754.687µs) from=127.0.0.1:53154
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.562445 [ERR] http: Request HEAD /v1/agent/members, error: method HEAD not allowed from=127.0.0.1:53156
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.562594 [DEBUG] http: Request HEAD /v1/agent/members (168.005µs) from=127.0.0.1:53156
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.564958 [DEBUG] http: Request OPTIONS /v1/agent/members (25.668µs) from=127.0.0.1:53156
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.566454 [ERR] http: Request GET /v1/acl/rules/translate, error: method GET not allowed from=127.0.0.1:53156
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.567118 [DEBUG] http: Request GET /v1/acl/rules/translate (669.684µs) from=127.0.0.1:53156
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.570471 [ERR] http: Request PUT /v1/acl/rules/translate, error: method PUT not allowed from=127.0.0.1:53158
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.571055 [DEBUG] http: Request PUT /v1/acl/rules/translate (594.016µs) from=127.0.0.1:53158
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.574162 [ERR] http: Request POST /v1/acl/rules/translate, error: Permission denied from=127.0.0.1:53160
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.574897 [DEBUG] http: Request POST /v1/acl/rules/translate (879.357µs) from=127.0.0.1:53160
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.582228 [ERR] http: Request DELETE /v1/acl/rules/translate, error: method DELETE not allowed from=127.0.0.1:53162
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.582915 [DEBUG] http: Request DELETE /v1/acl/rules/translate (695.686µs) from=127.0.0.1:53162
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.585930 [ERR] http: Request HEAD /v1/acl/rules/translate, error: method HEAD not allowed from=127.0.0.1:53164
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.586067 [DEBUG] http: Request HEAD /v1/acl/rules/translate (152.337µs) from=127.0.0.1:53164
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.588682 [DEBUG] http: Request OPTIONS /v1/acl/rules/translate (14.667µs) from=127.0.0.1:53164
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.591531 [DEBUG] http: Request GET /v1/catalog/datacenters (811.355µs) from=127.0.0.1:53164
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.594540 [ERR] http: Request PUT /v1/catalog/datacenters, error: method PUT not allowed from=127.0.0.1:53166
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.595092 [DEBUG] http: Request PUT /v1/catalog/datacenters (558.015µs) from=127.0.0.1:53166
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.598038 [ERR] http: Request POST /v1/catalog/datacenters, error: method POST not allowed from=127.0.0.1:53168
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.598720 [DEBUG] http: Request POST /v1/catalog/datacenters (679.351µs) from=127.0.0.1:53168
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.601738 [ERR] http: Request DELETE /v1/catalog/datacenters, error: method DELETE not allowed from=127.0.0.1:53170
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.602402 [DEBUG] http: Request DELETE /v1/catalog/datacenters (664.351µs) from=127.0.0.1:53170
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.605901 [ERR] http: Request HEAD /v1/catalog/datacenters, error: method HEAD not allowed from=127.0.0.1:53172
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.606057 [DEBUG] http: Request HEAD /v1/catalog/datacenters (177.005µs) from=127.0.0.1:53172
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.607467 [DEBUG] http: Request OPTIONS /v1/catalog/datacenters (18.334µs) from=127.0.0.1:53172
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.610508 [DEBUG] http: Request GET /v1/health/node/ (860.023µs) from=127.0.0.1:53172
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.614526 [ERR] http: Request PUT /v1/health/node/, error: method PUT not allowed from=127.0.0.1:53174
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.615274 [DEBUG] http: Request PUT /v1/health/node/ (758.02µs) from=127.0.0.1:53174
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.619158 [ERR] http: Request POST /v1/health/node/, error: method POST not allowed from=127.0.0.1:53176
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.620318 [DEBUG] http: Request POST /v1/health/node/ (1.154698ms) from=127.0.0.1:53176
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.624342 [ERR] http: Request DELETE /v1/health/node/, error: method DELETE not allowed from=127.0.0.1:53178
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.626178 [DEBUG] http: Request DELETE /v1/health/node/ (1.741046ms) from=127.0.0.1:53178
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.650568 [ERR] http: Request HEAD /v1/health/node/, error: method HEAD not allowed from=127.0.0.1:53180
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.650901 [DEBUG] http: Request HEAD /v1/health/node/ (346.01µs) from=127.0.0.1:53180
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.652422 [DEBUG] http: Request OPTIONS /v1/health/node/ (17µs) from=127.0.0.1:53180
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.654575 [DEBUG] http: Request GET /v1/session/info/ (653.684µs) from=127.0.0.1:53180
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.657335 [ERR] http: Request PUT /v1/session/info/, error: method PUT not allowed from=127.0.0.1:53182
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.657879 [DEBUG] http: Request PUT /v1/session/info/ (545.014µs) from=127.0.0.1:53182
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.660687 [ERR] http: Request POST /v1/session/info/, error: method POST not allowed from=127.0.0.1:53184
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.661244 [DEBUG] http: Request POST /v1/session/info/ (563.682µs) from=127.0.0.1:53184
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.664262 [ERR] http: Request DELETE /v1/session/info/, error: method DELETE not allowed from=127.0.0.1:53186
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.665320 [DEBUG] http: Request DELETE /v1/session/info/ (1.048028ms) from=127.0.0.1:53186
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.669026 [ERR] http: Request HEAD /v1/session/info/, error: method HEAD not allowed from=127.0.0.1:53188
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.669167 [DEBUG] http: Request HEAD /v1/session/info/ (162.671µs) from=127.0.0.1:53188
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.670714 [DEBUG] http: Request OPTIONS /v1/session/info/ (17.667µs) from=127.0.0.1:53188
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.672936 [DEBUG] http: Request GET /v1/status/leader (684.018µs) from=127.0.0.1:53188
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.676215 [ERR] http: Request PUT /v1/status/leader, error: method PUT not allowed from=127.0.0.1:53190
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.677146 [DEBUG] http: Request PUT /v1/status/leader (933.024µs) from=127.0.0.1:53190
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.680476 [ERR] http: Request POST /v1/status/leader, error: method POST not allowed from=127.0.0.1:53192
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.681192 [DEBUG] http: Request POST /v1/status/leader (704.352µs) from=127.0.0.1:53192
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.684192 [ERR] http: Request DELETE /v1/status/leader, error: method DELETE not allowed from=127.0.0.1:53194
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.684914 [DEBUG] http: Request DELETE /v1/status/leader (725.353µs) from=127.0.0.1:53194
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.687508 [ERR] http: Request HEAD /v1/status/leader, error: method HEAD not allowed from=127.0.0.1:53196
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.687643 [DEBUG] http: Request HEAD /v1/status/leader (159.337µs) from=127.0.0.1:53196
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.689246 [DEBUG] http: Request OPTIONS /v1/status/leader (15.667µs) from=127.0.0.1:53196
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.689777 [INFO] agent: Requesting shutdown
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.689840 [INFO] consul: shutting down server
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.689887 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.795565 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.853943 [INFO] manager: shutting down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.854947 [INFO] agent: consul server down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.855027 [INFO] agent: shutdown complete
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.855087 [INFO] agent: Stopping DNS server 127.0.0.1:17633 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.855230 [INFO] agent: Stopping DNS server 127.0.0.1:17633 (udp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.855395 [INFO] agent: Stopping HTTP server 127.0.0.1:17634 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.855890 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/30 18:52:04.855989 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_MethodNotAllowed_OSS (10.73s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/execute (0.03s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/explain (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/explain (0.04s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/explain (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/explain (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/node/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/node/ (0.03s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/node/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/id/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/id/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/id/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/register (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/service/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/peer (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/peer (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/peer (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/node/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/register (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/match (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/match (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/update (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/update/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/fire/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/fire/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/info/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/service/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/destroy/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-methods (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/health (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/services (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/state/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/state/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/snapshot (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/leaf/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/leaf/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policies (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policies (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/renew/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/token/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/leave (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/maintenance (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/maintenance (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/deregister/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/roles (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/name/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/checks (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/checks (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rules (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/configuration (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/txn (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/name/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/name/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/maintenance/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/logout (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/logout (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/logout (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/logout (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/members (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/node/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/node/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/status/leader (0.00s)
=== RUN   TestHTTPAPI_OptionMethod_OSS
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:04.913770 [WARN] agent: Node name "Node abcb7d5d-6013-c16b-8191-6a6ed5206bad" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:04.914200 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:04.916525 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:abcb7d5d-6013-c16b-8191-6a6ed5206bad Address:127.0.0.1:17644}]
2019/12/30 18:52:05 [INFO]  raft: Node at 127.0.0.1:17644 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.699582 [INFO] serf: EventMemberJoin: Node abcb7d5d-6013-c16b-8191-6a6ed5206bad.dc1 127.0.0.1
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.702952 [INFO] serf: EventMemberJoin: Node abcb7d5d-6013-c16b-8191-6a6ed5206bad 127.0.0.1
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.703749 [INFO] consul: Handled member-join event for server "Node abcb7d5d-6013-c16b-8191-6a6ed5206bad.dc1" in area "wan"
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.704049 [INFO] consul: Adding LAN server Node abcb7d5d-6013-c16b-8191-6a6ed5206bad (Addr: tcp/127.0.0.1:17644) (DC: dc1)
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.704352 [INFO] agent: Started DNS server 127.0.0.1:17639 (udp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.704705 [INFO] agent: Started DNS server 127.0.0.1:17639 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.707070 [INFO] agent: Started HTTP server on 127.0.0.1:17640 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:05.707183 [INFO] agent: started state syncer
2019/12/30 18:52:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:05 [INFO]  raft: Node at 127.0.0.1:17644 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:06 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:06 [INFO]  raft: Node at 127.0.0.1:17644 [Leader] entering Leader state
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.247687 [INFO] consul: cluster leadership acquired
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.248203 [INFO] consul: New leader elected: Node abcb7d5d-6013-c16b-8191-6a6ed5206bad
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.398801 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.455684 [INFO] acl: initializing acls
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.712761 [INFO] acl: initializing acls
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.713421 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.863475 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.864135 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.864236 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.865055 [INFO] serf: EventMemberUpdate: Node abcb7d5d-6013-c16b-8191-6a6ed5206bad
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:06.865742 [INFO] serf: EventMemberUpdate: Node abcb7d5d-6013-c16b-8191-6a6ed5206bad.dc1
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:07.071800 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:07.072668 [INFO] serf: EventMemberUpdate: Node abcb7d5d-6013-c16b-8191-6a6ed5206bad
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:07.073364 [INFO] serf: EventMemberUpdate: Node abcb7d5d-6013-c16b-8191-6a6ed5206bad.dc1
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:07.946360 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:07.946854 [DEBUG] consul: Skipping self join check for "Node abcb7d5d-6013-c16b-8191-6a6ed5206bad" since the cluster is too small
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:07.946947 [INFO] consul: member 'Node abcb7d5d-6013-c16b-8191-6a6ed5206bad' joined, marking health alive
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.173128 [DEBUG] consul: Skipping self join check for "Node abcb7d5d-6013-c16b-8191-6a6ed5206bad" since the cluster is too small
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.173644 [DEBUG] consul: Skipping self join check for "Node abcb7d5d-6013-c16b-8191-6a6ed5206bad" since the cluster is too small
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.198412 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/query (17.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.199767 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/query/ (850.023µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/execute
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.200455 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/query/xxx/execute (99.003µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/explain
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.201039 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/query/xxx/explain (103.336µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/logout
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.201563 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/logout (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/members
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.202031 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/members (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/maintenance/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.202479 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/service/maintenance/ (15.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.202968 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/health/node/ (16.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/info/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.203428 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/session/info/ (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.203926 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/rules/translate (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/datacenters
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.204431 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/datacenters (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/status/leader
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.204908 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/status/leader (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.205379 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/auth-method/ (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/clone/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.205901 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/clone/ (16µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/id/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.206362 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/health/service/id/ (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/pass/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.206860 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/check/pass/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.207311 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/coordinate/node/ (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/destroy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.207811 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/session/destroy/ (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.208256 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/config (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/service/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.208709 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/health/service/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.209143 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/service/ (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/authorize
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.209651 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/connect/authorize (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/register
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.210135 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/service/register (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/peer
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.210640 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/operator/raft/peer (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.211086 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/session/node/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/datacenters
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.211615 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/coordinate/datacenters (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.212094 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/internal/ui/node/ (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/replication
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.212544 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/replication (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/join/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.212989 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/join/ (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/register
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.213446 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/check/register (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/match
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.213924 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/connect/intentions/match (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/update
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.214467 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/coordinate/update (69.335µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/create
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.214949 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/session/create (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/update
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.215404 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/update (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/update/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.215936 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/check/update/ (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/fire/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.216401 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/event/fire/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/list
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.216848 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/event/list (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/info/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.217294 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/info/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/tokens
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.217752 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/tokens (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/services
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.218320 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/services (16.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.218791 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/nodes (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/self
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.219259 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/token/self (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/connect/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.219801 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/connect/ (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/service/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.220257 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/service/ (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/kv/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.220743 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/kv/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/destroy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.221186 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/destroy/ (13.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-methods
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.221733 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/auth-methods (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/services
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.222241 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/services (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.222700 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/policy (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/keyring
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.223142 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/operator/keyring (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/health
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.223608 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/operator/autopilot/health (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/deregister
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.224069 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/deregister (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.224604 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/rules/translate/ (14.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/host
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.225055 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/host (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/deregister/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.225525 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/service/deregister/ (14.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.226125 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/connect/ca/configuration (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.226602 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/coordinate/nodes (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/services
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.227054 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/internal/ui/services (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/proxy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.227518 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/connect/proxy/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/state/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.227973 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/health/state/ (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/connect/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.228418 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/health/connect/ (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.228867 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/policy/ (15.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.229313 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/token/ (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/snapshot
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.229892 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/snapshot (14.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/bootstrap
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.230371 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/bootstrap (15.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policies
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.230836 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/policies (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.231282 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/binding-rule (14.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/leaf/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.231732 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/connect/ca/leaf/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/list
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.232351 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/list (15.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/token/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.232809 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/token/ (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/leave
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.233252 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/leave (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/roots
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.233735 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/connect/ca/roots (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/renew/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.234200 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/session/renew/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/maintenance
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.234724 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/maintenance (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/login
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.235178 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/login (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/deregister/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.235646 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/check/deregister/ (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/create
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.236203 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/create (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/roles
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.236650 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/roles (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/name/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.237089 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/role/name/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.237554 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/role/ (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/fail/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.238012 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/check/fail/ (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.238470 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/connect/intentions (14.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rules
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.238918 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/binding-rules (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.239357 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/auth-method (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/checks
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.239902 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/checks (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/metrics
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.240355 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/metrics (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.240929 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/connect/intentions/ (17µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.241433 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/operator/autopilot/configuration (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.241974 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/role (85.336µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/self
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.242441 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/self (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.242946 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/config/ (21.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/check
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.243421 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/connect/intentions/check (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/txn
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.243870 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/txn (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/force-leave/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.244348 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/force-leave/ (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/warn/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.244912 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/check/warn/ (16.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/register
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.245396 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/register (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/roots
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.245863 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/connect/ca/roots (16.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/checks/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.246411 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/health/checks/ (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.246870 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/internal/ui/nodes (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.247320 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/catalog/node/ (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.247760 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/binding-rule/ (14.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.248198 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/acl/token (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/name/
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.248665 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/agent/health/service/name/ (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.249184 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/operator/raft/configuration (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/list
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.249721 [DEBUG] http: Request OPTIONS http://127.0.0.1:17640/v1/session/list (15.001µs) from=
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.249874 [INFO] agent: Requesting shutdown
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.249940 [INFO] consul: shutting down server
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.249988 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.304019 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.354035 [INFO] manager: shutting down
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.354472 [INFO] agent: consul server down
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.354534 [INFO] agent: shutdown complete
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.354591 [INFO] agent: Stopping DNS server 127.0.0.1:17639 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.354746 [INFO] agent: Stopping DNS server 127.0.0.1:17639 (udp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.354946 [INFO] agent: Stopping HTTP server 127.0.0.1:17640 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.355190 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_OptionMethod_OSS - 2019/12/30 18:52:08.355268 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_OptionMethod_OSS (3.50s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/list (0.00s)
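The subtests above probe every registered HTTP endpoint with the OPTIONS method and expect the agent's HTTP server to answer each one. A minimal, self-contained sketch of the same table-driven pattern — this is an illustration using Go's standard `net/http/httptest`, not consul's actual test code, and the handler, paths, and `Allow` header here are assumptions for the example:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

func main() {
	// Toy handler standing in for the agent's HTTP server: answer
	// OPTIONS with 200 and an Allow header, reject everything else.
	mux := http.NewServeMux()
	paths := []string{"/v1/status/leader", "/v1/agent/members", "/v1/kv/"}
	for _, p := range paths {
		mux.HandleFunc(p, func(w http.ResponseWriter, r *http.Request) {
			if r.Method == http.MethodOptions {
				w.Header().Set("Allow", "GET,PUT,OPTIONS")
				w.WriteHeader(http.StatusOK)
				return
			}
			w.WriteHeader(http.StatusMethodNotAllowed)
		})
	}
	srv := httptest.NewServer(mux)
	defer srv.Close()

	// Table-driven loop mirroring the === RUN subtests in the log:
	// one OPTIONS request per endpoint, each must come back 200.
	for _, p := range paths {
		req, err := http.NewRequest(http.MethodOptions, srv.URL+p, nil)
		if err != nil {
			panic(err)
		}
		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			panic(err)
		}
		resp.Body.Close()
		if resp.StatusCode != http.StatusOK {
			panic(fmt.Sprintf("OPTIONS %s: got %d, want 200", p, resp.StatusCode))
		}
		fmt.Printf("OPTIONS %s -> %d\n", p, resp.StatusCode)
	}
}
```

In the real suite each path runs under `t.Run`, which is what produces the per-endpoint `--- PASS: .../OPTIONS_/v1/...` lines seen above.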
=== RUN   TestHTTPAPI_AllowedNets_OSS
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:08.417693 [WARN] agent: Node name "Node 97c853b4-2d02-870e-4ff2-02b984d1bc48" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:08.418188 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:08.420699 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:97c853b4-2d02-870e-4ff2-02b984d1bc48 Address:127.0.0.1:17650}]
2019/12/30 18:52:09 [INFO]  raft: Node at 127.0.0.1:17650 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.249990 [INFO] serf: EventMemberJoin: Node 97c853b4-2d02-870e-4ff2-02b984d1bc48.dc1 127.0.0.1
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.253203 [INFO] serf: EventMemberJoin: Node 97c853b4-2d02-870e-4ff2-02b984d1bc48 127.0.0.1
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.254035 [INFO] consul: Handled member-join event for server "Node 97c853b4-2d02-870e-4ff2-02b984d1bc48.dc1" in area "wan"
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.254318 [INFO] consul: Adding LAN server Node 97c853b4-2d02-870e-4ff2-02b984d1bc48 (Addr: tcp/127.0.0.1:17650) (DC: dc1)
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.254812 [INFO] agent: Started DNS server 127.0.0.1:17645 (udp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.254886 [INFO] agent: Started DNS server 127.0.0.1:17645 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.258138 [INFO] agent: Started HTTP server on 127.0.0.1:17646 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.258239 [INFO] agent: started state syncer
2019/12/30 18:52:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:09 [INFO]  raft: Node at 127.0.0.1:17650 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:09 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:09 [INFO]  raft: Node at 127.0.0.1:17650 [Leader] entering Leader state
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.897603 [INFO] consul: cluster leadership acquired
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:09.898027 [INFO] consul: New leader elected: Node 97c853b4-2d02-870e-4ff2-02b984d1bc48
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:10.005749 [INFO] acl: initializing acls
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:10.111515 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:10.355098 [INFO] acl: initializing acls
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:10.355443 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:10.556346 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:11.114475 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:11.114604 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:11.114527 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:11.115439 [INFO] serf: EventMemberUpdate: Node 97c853b4-2d02-870e-4ff2-02b984d1bc48
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:11.116123 [INFO] serf: EventMemberUpdate: Node 97c853b4-2d02-870e-4ff2-02b984d1bc48.dc1
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:11.116251 [INFO] serf: EventMemberUpdate: Node 97c853b4-2d02-870e-4ff2-02b984d1bc48
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:11.116906 [INFO] serf: EventMemberUpdate: Node 97c853b4-2d02-870e-4ff2-02b984d1bc48.dc1
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.148657 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.149102 [DEBUG] consul: Skipping self join check for "Node 97c853b4-2d02-870e-4ff2-02b984d1bc48" since the cluster is too small
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.149247 [INFO] consul: member 'Node 97c853b4-2d02-870e-4ff2-02b984d1bc48' joined, marking health alive
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.151685 [INFO] agent: Synced node info
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.151822 [DEBUG] agent: Node info in sync
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.363425 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.364824 [DEBUG] consul: Skipping self join check for "Node 97c853b4-2d02-870e-4ff2-02b984d1bc48" since the cluster is too small
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.365348 [DEBUG] consul: Skipping self join check for "Node 97c853b4-2d02-870e-4ff2-02b984d1bc48" since the cluster is too small
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/register
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.379702 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/check/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.379878 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/check/register (213.339µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/join/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.380654 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/join/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.380789 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/join/ (150.338µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/coordinate/update
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.381477 [ERR] http: Request PUT http://127.0.0.1:17646/v1/coordinate/update, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.381603 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/coordinate/update (139.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/create
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.382280 [ERR] http: Request PUT http://127.0.0.1:17646/v1/session/create, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.382406 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/session/create (139.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/update
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.386008 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/update, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.386141 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/update (141.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/update/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.387028 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/check/update/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.387163 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/check/update/ (141.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/event/fire/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.387871 [ERR] http: Request PUT http://127.0.0.1:17646/v1/event/fire/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.387964 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/event/fire/ (99.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/kv/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.388652 [ERR] http: Request PUT http://127.0.0.1:17646/v1/kv/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.388744 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/kv/ (97.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/kv/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.389504 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/kv/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.389612 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/kv/ (116.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/destroy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.390351 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/destroy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.390449 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/destroy/ (101.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.391133 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/policy, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.391226 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/policy (99.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.391888 [ERR] http: Request POST http://127.0.0.1:17646/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.391984 [DEBUG] http: Request POST http://127.0.0.1:17646/v1/operator/keyring (99.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.392492 [ERR] http: Request PUT http://127.0.0.1:17646/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.392584 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/operator/keyring (96.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.393166 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.393260 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/operator/keyring (94.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/deregister/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.393959 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/service/deregister/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.394078 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/service/deregister/ (123.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/deregister
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.394837 [ERR] http: Request PUT http://127.0.0.1:17646/v1/catalog/deregister, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.394934 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/catalog/deregister (101.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/ca/configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.396503 [ERR] http: Request PUT http://127.0.0.1:17646/v1/connect/ca/configuration, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.396636 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/connect/ca/configuration (141.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.397484 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/policy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.397725 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/policy/ (246.674µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/policy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.398571 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/acl/policy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.398680 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/acl/policy/ (115.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.399545 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.399657 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/token/ (121.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/token/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.400457 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/acl/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.400566 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/acl/token/ (115.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/snapshot
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.403638 [ERR] http: Request PUT http://127.0.0.1:17646/v1/snapshot, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.403764 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/snapshot (134.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/bootstrap
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.404370 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/bootstrap, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.404702 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/bootstrap (330.676µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.405330 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/binding-rule, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.405424 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/binding-rule (107.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/renew/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.405931 [ERR] http: Request PUT http://127.0.0.1:17646/v1/session/renew/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.406020 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/session/renew/ (93.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/token/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.406531 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.406616 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/token/ (87.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/leave
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.407066 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/leave, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.407150 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/leave (85.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/maintenance
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.407642 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/maintenance, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.407728 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/maintenance (89.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/login
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.417461 [ERR] http: Request POST http://127.0.0.1:17646/v1/acl/login, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.417609 [DEBUG] http: Request POST http://127.0.0.1:17646/v1/acl/login (159.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/deregister/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.418425 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/check/deregister/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.418529 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/check/deregister/ (116.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/create
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.419071 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/create, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.419164 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/create (99.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.419714 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/role/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.419806 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/role/ (98.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/role/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.420321 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/acl/role/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.420412 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/acl/role/ (97.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/fail/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.421068 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/check/fail/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.421178 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/check/fail/ (118.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/connect/intentions
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.422492 [ERR] http: Request POST http://127.0.0.1:17646/v1/connect/intentions, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.422595 [DEBUG] http: Request POST http://127.0.0.1:17646/v1/connect/intentions (110.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.423090 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/auth-method, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.423173 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/auth-method (87.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/intentions/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.423721 [ERR] http: Request PUT http://127.0.0.1:17646/v1/connect/intentions/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.423806 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/connect/intentions/ (89.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/connect/intentions/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.424257 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/connect/intentions/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.424336 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/connect/intentions/ (84.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/autopilot/configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.428932 [ERR] http: Request PUT http://127.0.0.1:17646/v1/operator/autopilot/configuration, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.429055 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/operator/autopilot/configuration (136.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.429666 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/role, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.429765 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/role (105.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/config/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.430257 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/config/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.430348 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/config/ (98.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/txn
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.431029 [ERR] http: Request PUT http://127.0.0.1:17646/v1/txn, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.431126 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/txn (101.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/force-leave/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.434027 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/force-leave/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.434160 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/force-leave/ (139.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/warn/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.435324 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/check/warn/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.435437 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/check/warn/ (119.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/register
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.436283 [ERR] http: Request PUT http://127.0.0.1:17646/v1/catalog/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.436385 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/catalog/register (110.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.439085 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/binding-rule/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.439194 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/binding-rule/ (115.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/binding-rule/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.440099 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/acl/binding-rule/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.440192 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/acl/binding-rule/ (99.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.442505 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/token, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.442612 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/token (114.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/logout
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.443327 [ERR] http: Request POST http://127.0.0.1:17646/v1/acl/logout, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.443443 [DEBUG] http: Request POST http://127.0.0.1:17646/v1/acl/logout (124.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/maintenance/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.444061 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/service/maintenance/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.444172 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/service/maintenance/ (90.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/rules/translate
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.444697 [ERR] http: Request POST http://127.0.0.1:17646/v1/acl/rules/translate, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.444779 [DEBUG] http: Request POST http://127.0.0.1:17646/v1/acl/rules/translate (84.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.445332 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/auth-method/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.445432 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/auth-method/ (105.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/auth-method/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.446065 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/acl/auth-method/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.446169 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/acl/auth-method/ (108.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/clone/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.446705 [ERR] http: Request PUT http://127.0.0.1:17646/v1/acl/clone/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.446791 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/acl/clone/ (92.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/pass/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.447642 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/check/pass/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.447738 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/check/pass/ (106.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/destroy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.448261 [ERR] http: Request PUT http://127.0.0.1:17646/v1/session/destroy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.448346 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/session/destroy/ (90.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/register
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.454674 [ERR] http: Request PUT http://127.0.0.1:17646/v1/agent/service/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.454789 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/agent/service/register (124.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/config
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.455789 [ERR] http: Request PUT http://127.0.0.1:17646/v1/config, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.456193 [DEBUG] http: Request PUT http://127.0.0.1:17646/v1/config (234.339µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/agent/connect/authorize
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.457169 [ERR] http: Request POST http://127.0.0.1:17646/v1/agent/connect/authorize, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.457544 [DEBUG] http: Request POST http://127.0.0.1:17646/v1/agent/connect/authorize (316.009µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/raft/peer
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.458712 [ERR] http: Request DELETE http://127.0.0.1:17646/v1/operator/raft/peer, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.459089 [DEBUG] http: Request DELETE http://127.0.0.1:17646/v1/operator/raft/peer (354.01µs) from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.459707 [INFO] agent: Requesting shutdown
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.460076 [INFO] consul: shutting down server
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.460488 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.562571 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.654465 [INFO] manager: shutting down
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.655284 [INFO] agent: consul server down
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.655346 [INFO] agent: shutdown complete
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.655403 [INFO] agent: Stopping DNS server 127.0.0.1:17645 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.655563 [INFO] agent: Stopping DNS server 127.0.0.1:17645 (udp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.655773 [INFO] agent: Stopping HTTP server 127.0.0.1:17646 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.656018 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_AllowedNets_OSS - 2019/12/30 18:52:13.656105 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_AllowedNets_OSS (5.30s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/maintenance (0.01s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/raft/peer (0.00s)
=== RUN   TestHTTPServer_UnixSocket
=== PAUSE TestHTTPServer_UnixSocket
=== RUN   TestHTTPServer_UnixSocket_FileExists
=== PAUSE TestHTTPServer_UnixSocket_FileExists
=== RUN   TestHTTPServer_H2
--- SKIP: TestHTTPServer_H2 (0.00s)
    http_test.go:131: DM-skipped
=== RUN   TestSetIndex
=== PAUSE TestSetIndex
=== RUN   TestSetKnownLeader
=== PAUSE TestSetKnownLeader
=== RUN   TestSetLastContact
=== PAUSE TestSetLastContact
=== RUN   TestSetMeta
=== PAUSE TestSetMeta
=== RUN   TestHTTPAPI_BlockEndpoints
=== PAUSE TestHTTPAPI_BlockEndpoints
=== RUN   TestHTTPAPI_Ban_Nonprintable_Characters
--- SKIP: TestHTTPAPI_Ban_Nonprintable_Characters (0.00s)
    http_test.go:324: DM-skipped
=== RUN   TestHTTPAPI_Allow_Nonprintable_Characters_With_Flag
--- SKIP: TestHTTPAPI_Allow_Nonprintable_Characters_With_Flag (0.00s)
    http_test.go:344: DM-skipped
=== RUN   TestHTTPAPI_TranslateAddrHeader
=== PAUSE TestHTTPAPI_TranslateAddrHeader
=== RUN   TestHTTPAPIResponseHeaders
=== PAUSE TestHTTPAPIResponseHeaders
=== RUN   TestContentTypeIsJSON
=== PAUSE TestContentTypeIsJSON
=== RUN   TestHTTP_wrap_obfuscateLog
=== PAUSE TestHTTP_wrap_obfuscateLog
=== RUN   TestPrettyPrint
=== PAUSE TestPrettyPrint
=== RUN   TestPrettyPrintBare
=== PAUSE TestPrettyPrintBare
=== RUN   TestParseSource
=== PAUSE TestParseSource
=== RUN   TestParseCacheControl
=== RUN   TestParseCacheControl/empty_header
=== RUN   TestParseCacheControl/simple_max-age
=== RUN   TestParseCacheControl/zero_max-age
=== RUN   TestParseCacheControl/must-revalidate
=== RUN   TestParseCacheControl/mixes_age,_must-revalidate
=== RUN   TestParseCacheControl/quoted_max-age
=== RUN   TestParseCacheControl/mixed_case_max-age
=== RUN   TestParseCacheControl/simple_stale-if-error
=== RUN   TestParseCacheControl/combined_with_space
=== RUN   TestParseCacheControl/combined_no_space
=== RUN   TestParseCacheControl/unsupported_directive
=== RUN   TestParseCacheControl/mixed_unsupported_directive
=== RUN   TestParseCacheControl/garbage_value
=== RUN   TestParseCacheControl/garbage_value_with_quotes
--- PASS: TestParseCacheControl (0.01s)
    --- PASS: TestParseCacheControl/empty_header (0.00s)
    --- PASS: TestParseCacheControl/simple_max-age (0.00s)
    --- PASS: TestParseCacheControl/zero_max-age (0.00s)
    --- PASS: TestParseCacheControl/must-revalidate (0.00s)
    --- PASS: TestParseCacheControl/mixes_age,_must-revalidate (0.00s)
    --- PASS: TestParseCacheControl/quoted_max-age (0.00s)
    --- PASS: TestParseCacheControl/mixed_case_max-age (0.00s)
    --- PASS: TestParseCacheControl/simple_stale-if-error (0.00s)
    --- PASS: TestParseCacheControl/combined_with_space (0.00s)
    --- PASS: TestParseCacheControl/combined_no_space (0.00s)
    --- PASS: TestParseCacheControl/unsupported_directive (0.00s)
    --- PASS: TestParseCacheControl/mixed_unsupported_directive (0.00s)
    --- PASS: TestParseCacheControl/garbage_value (0.00s)
    --- PASS: TestParseCacheControl/garbage_value_with_quotes (0.00s)
=== RUN   TestParseWait
=== PAUSE TestParseWait
=== RUN   TestPProfHandlers_EnableDebug
=== PAUSE TestPProfHandlers_EnableDebug
=== RUN   TestPProfHandlers_DisableDebugNoACLs
--- SKIP: TestPProfHandlers_DisableDebugNoACLs (0.00s)
    http_test.go:761: DM-skipped
=== RUN   TestPProfHandlers_ACLs
=== PAUSE TestPProfHandlers_ACLs
=== RUN   TestParseWait_InvalidTime
=== PAUSE TestParseWait_InvalidTime
=== RUN   TestParseWait_InvalidIndex
=== PAUSE TestParseWait_InvalidIndex
=== RUN   TestParseConsistency
=== PAUSE TestParseConsistency
=== RUN   TestParseConsistencyAndMaxStale
WARNING: bootstrap = true: do not enable unless necessary
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:13.796742 [WARN] agent: Node name "Node 8e04138e-4f12-62f3-0c69-2f24c138452e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:13.797309 [DEBUG] tlsutil: Update with version 1
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:13.800261 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8e04138e-4f12-62f3-0c69-2f24c138452e Address:127.0.0.1:17656}]
2019/12/30 18:52:15 [INFO]  raft: Node at 127.0.0.1:17656 [Follower] entering Follower state (Leader: "")
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.120782 [INFO] serf: EventMemberJoin: Node 8e04138e-4f12-62f3-0c69-2f24c138452e.dc1 127.0.0.1
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.125517 [INFO] serf: EventMemberJoin: Node 8e04138e-4f12-62f3-0c69-2f24c138452e 127.0.0.1
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.126619 [INFO] consul: Adding LAN server Node 8e04138e-4f12-62f3-0c69-2f24c138452e (Addr: tcp/127.0.0.1:17656) (DC: dc1)
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.126707 [INFO] consul: Handled member-join event for server "Node 8e04138e-4f12-62f3-0c69-2f24c138452e.dc1" in area "wan"
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.127675 [INFO] agent: Started DNS server 127.0.0.1:17651 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.127996 [INFO] agent: Started DNS server 127.0.0.1:17651 (udp)
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.130528 [INFO] agent: Started HTTP server on 127.0.0.1:17652 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:15.130668 [INFO] agent: started state syncer
2019/12/30 18:52:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:15 [INFO]  raft: Node at 127.0.0.1:17656 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:16 [INFO]  raft: Node at 127.0.0.1:17656 [Leader] entering Leader state
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.654746 [INFO] consul: cluster leadership acquired
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.655205 [INFO] consul: New leader elected: Node 8e04138e-4f12-62f3-0c69-2f24c138452e
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.740908 [INFO] agent: Requesting shutdown
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.741018 [INFO] consul: shutting down server
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.741069 [WARN] serf: Shutdown without a Leave
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.741460 [ERR] agent: failed to sync remote state: No cluster leader
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.879192 [WARN] serf: Shutdown without a Leave
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:16.970978 [INFO] manager: shutting down
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.054329 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.054687 [INFO] agent: consul server down
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.054740 [INFO] agent: shutdown complete
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.054793 [INFO] agent: Stopping DNS server 127.0.0.1:17651 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.054953 [INFO] agent: Stopping DNS server 127.0.0.1:17651 (udp)
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.055124 [INFO] agent: Stopping HTTP server 127.0.0.1:17652 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.055357 [INFO] agent: Waiting for endpoints to shut down
TestParseConsistencyAndMaxStale - 2019/12/30 18:52:17.055441 [INFO] agent: Endpoints down
--- PASS: TestParseConsistencyAndMaxStale (3.38s)
=== RUN   TestParseConsistency_Invalid
=== PAUSE TestParseConsistency_Invalid
=== RUN   TestACLResolution
=== PAUSE TestACLResolution
=== RUN   TestEnableWebUI
=== PAUSE TestEnableWebUI
=== RUN   TestParseToken_ProxyTokenResolve
=== PAUSE TestParseToken_ProxyTokenResolve
=== RUN   TestAllowedNets
--- SKIP: TestAllowedNets (0.00s)
    http_test.go:1227: DM-skipped
=== RUN   TestIntentionsList_empty
=== PAUSE TestIntentionsList_empty
=== RUN   TestIntentionsList_values
=== PAUSE TestIntentionsList_values
=== RUN   TestIntentionsMatch_basic
=== PAUSE TestIntentionsMatch_basic
=== RUN   TestIntentionsMatch_noBy
=== PAUSE TestIntentionsMatch_noBy
=== RUN   TestIntentionsMatch_byInvalid
=== PAUSE TestIntentionsMatch_byInvalid
=== RUN   TestIntentionsMatch_noName
=== PAUSE TestIntentionsMatch_noName
=== RUN   TestIntentionsCheck_basic
=== PAUSE TestIntentionsCheck_basic
=== RUN   TestIntentionsCheck_noSource
=== PAUSE TestIntentionsCheck_noSource
=== RUN   TestIntentionsCheck_noDestination
=== PAUSE TestIntentionsCheck_noDestination
=== RUN   TestIntentionsCreate_good
=== PAUSE TestIntentionsCreate_good
=== RUN   TestIntentionsCreate_noBody
=== PAUSE TestIntentionsCreate_noBody
=== RUN   TestIntentionsSpecificGet_good
=== PAUSE TestIntentionsSpecificGet_good
=== RUN   TestIntentionsSpecificGet_invalidId
=== PAUSE TestIntentionsSpecificGet_invalidId
=== RUN   TestIntentionsSpecificUpdate_good
=== PAUSE TestIntentionsSpecificUpdate_good
=== RUN   TestIntentionsSpecificDelete_good
=== PAUSE TestIntentionsSpecificDelete_good
=== RUN   TestParseIntentionMatchEntry
=== RUN   TestParseIntentionMatchEntry/foo
=== RUN   TestParseIntentionMatchEntry/foo/bar
=== RUN   TestParseIntentionMatchEntry/foo/bar/baz
--- PASS: TestParseIntentionMatchEntry (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo/bar (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo/bar/baz (0.00s)
=== RUN   TestAgent_LoadKeyrings
=== PAUSE TestAgent_LoadKeyrings
=== RUN   TestAgent_InmemKeyrings
=== PAUSE TestAgent_InmemKeyrings
=== RUN   TestAgent_InitKeyring
=== PAUSE TestAgent_InitKeyring
=== RUN   TestAgentKeyring_ACL
=== PAUSE TestAgentKeyring_ACL
=== RUN   TestKVSEndpoint_PUT_GET_DELETE
=== PAUSE TestKVSEndpoint_PUT_GET_DELETE
=== RUN   TestKVSEndpoint_Recurse
=== PAUSE TestKVSEndpoint_Recurse
=== RUN   TestKVSEndpoint_DELETE_CAS
=== PAUSE TestKVSEndpoint_DELETE_CAS
=== RUN   TestKVSEndpoint_CAS
=== PAUSE TestKVSEndpoint_CAS
=== RUN   TestKVSEndpoint_ListKeys
=== PAUSE TestKVSEndpoint_ListKeys
=== RUN   TestKVSEndpoint_AcquireRelease
=== PAUSE TestKVSEndpoint_AcquireRelease
=== RUN   TestKVSEndpoint_GET_Raw
=== PAUSE TestKVSEndpoint_GET_Raw
=== RUN   TestKVSEndpoint_PUT_ConflictingFlags
=== PAUSE TestKVSEndpoint_PUT_ConflictingFlags
=== RUN   TestKVSEndpoint_DELETE_ConflictingFlags
=== PAUSE TestKVSEndpoint_DELETE_ConflictingFlags
=== RUN   TestNotifyGroup
--- PASS: TestNotifyGroup (0.00s)
=== RUN   TestNotifyGroup_Clear
--- PASS: TestNotifyGroup_Clear (0.00s)
=== RUN   TestOperator_RaftConfiguration
=== PAUSE TestOperator_RaftConfiguration
=== RUN   TestOperator_RaftPeer
=== PAUSE TestOperator_RaftPeer
=== RUN   TestOperator_KeyringInstall
=== PAUSE TestOperator_KeyringInstall
=== RUN   TestOperator_KeyringList
=== PAUSE TestOperator_KeyringList
=== RUN   TestOperator_KeyringRemove
=== PAUSE TestOperator_KeyringRemove
=== RUN   TestOperator_KeyringUse
=== PAUSE TestOperator_KeyringUse
=== RUN   TestOperator_Keyring_InvalidRelayFactor
=== PAUSE TestOperator_Keyring_InvalidRelayFactor
=== RUN   TestOperator_AutopilotGetConfiguration
=== PAUSE TestOperator_AutopilotGetConfiguration
=== RUN   TestOperator_AutopilotSetConfiguration
--- SKIP: TestOperator_AutopilotSetConfiguration (0.00s)
    operator_endpoint_test.go:318: DM-skipped
=== RUN   TestOperator_AutopilotCASConfiguration
=== PAUSE TestOperator_AutopilotCASConfiguration
=== RUN   TestOperator_ServerHealth
=== PAUSE TestOperator_ServerHealth
=== RUN   TestOperator_ServerHealth_Unhealthy
=== PAUSE TestOperator_ServerHealth_Unhealthy
=== RUN   TestPreparedQuery_Create
=== PAUSE TestPreparedQuery_Create
=== RUN   TestPreparedQuery_List
=== PAUSE TestPreparedQuery_List
=== RUN   TestPreparedQuery_Execute
=== PAUSE TestPreparedQuery_Execute
=== RUN   TestPreparedQuery_ExecuteCached
=== PAUSE TestPreparedQuery_ExecuteCached
=== RUN   TestPreparedQuery_Explain
=== PAUSE TestPreparedQuery_Explain
=== RUN   TestPreparedQuery_Get
=== PAUSE TestPreparedQuery_Get
=== RUN   TestPreparedQuery_Update
=== PAUSE TestPreparedQuery_Update
=== RUN   TestPreparedQuery_Delete
=== PAUSE TestPreparedQuery_Delete
=== RUN   TestPreparedQuery_parseLimit
=== PAUSE TestPreparedQuery_parseLimit
=== RUN   TestPreparedQuery_Integration
--- SKIP: TestPreparedQuery_Integration (0.00s)
    prepared_query_endpoint_test.go:990: DM-skipped
=== RUN   TestRexecWriter
--- SKIP: TestRexecWriter (0.00s)
    remote_exec_test.go:28: DM-skipped
=== RUN   TestRemoteExecGetSpec
=== PAUSE TestRemoteExecGetSpec
=== RUN   TestRemoteExecGetSpec_ACLToken
=== PAUSE TestRemoteExecGetSpec_ACLToken
=== RUN   TestRemoteExecGetSpec_ACLAgentToken
=== PAUSE TestRemoteExecGetSpec_ACLAgentToken
=== RUN   TestRemoteExecGetSpec_ACLDeny
=== PAUSE TestRemoteExecGetSpec_ACLDeny
=== RUN   TestRemoteExecWrites
=== PAUSE TestRemoteExecWrites
=== RUN   TestRemoteExecWrites_ACLToken
=== PAUSE TestRemoteExecWrites_ACLToken
=== RUN   TestRemoteExecWrites_ACLAgentToken
=== PAUSE TestRemoteExecWrites_ACLAgentToken
=== RUN   TestRemoteExecWrites_ACLDeny
=== PAUSE TestRemoteExecWrites_ACLDeny
=== RUN   TestHandleRemoteExec
=== PAUSE TestHandleRemoteExec
=== RUN   TestHandleRemoteExecFailed
=== PAUSE TestHandleRemoteExecFailed
=== RUN   TestServiceManager_RegisterService
WARNING: bootstrap = true: do not enable unless necessary
TestServiceManager_RegisterService - 2019/12/30 18:52:17.153881 [WARN] agent: Node name "Node 5a72e188-a688-abb7-43e1-ebe9be26233f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServiceManager_RegisterService - 2019/12/30 18:52:17.154521 [DEBUG] tlsutil: Update with version 1
TestServiceManager_RegisterService - 2019/12/30 18:52:17.156939 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5a72e188-a688-abb7-43e1-ebe9be26233f Address:127.0.0.1:17662}]
2019/12/30 18:52:17 [INFO]  raft: Node at 127.0.0.1:17662 [Follower] entering Follower state (Leader: "")
TestServiceManager_RegisterService - 2019/12/30 18:52:17.927614 [INFO] serf: EventMemberJoin: Node 5a72e188-a688-abb7-43e1-ebe9be26233f.dc1 127.0.0.1
TestServiceManager_RegisterService - 2019/12/30 18:52:17.952197 [INFO] serf: EventMemberJoin: Node 5a72e188-a688-abb7-43e1-ebe9be26233f 127.0.0.1
TestServiceManager_RegisterService - 2019/12/30 18:52:17.953215 [INFO] consul: Handled member-join event for server "Node 5a72e188-a688-abb7-43e1-ebe9be26233f.dc1" in area "wan"
TestServiceManager_RegisterService - 2019/12/30 18:52:17.953688 [INFO] consul: Adding LAN server Node 5a72e188-a688-abb7-43e1-ebe9be26233f (Addr: tcp/127.0.0.1:17662) (DC: dc1)
TestServiceManager_RegisterService - 2019/12/30 18:52:17.954343 [INFO] agent: Started DNS server 127.0.0.1:17657 (tcp)
TestServiceManager_RegisterService - 2019/12/30 18:52:17.955093 [INFO] agent: Started DNS server 127.0.0.1:17657 (udp)
TestServiceManager_RegisterService - 2019/12/30 18:52:17.959061 [INFO] agent: Started HTTP server on 127.0.0.1:17658 (tcp)
TestServiceManager_RegisterService - 2019/12/30 18:52:17.959179 [INFO] agent: started state syncer
2019/12/30 18:52:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:17 [INFO]  raft: Node at 127.0.0.1:17662 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:18 [INFO]  raft: Node at 127.0.0.1:17662 [Leader] entering Leader state
TestServiceManager_RegisterService - 2019/12/30 18:52:18.438225 [INFO] consul: cluster leadership acquired
TestServiceManager_RegisterService - 2019/12/30 18:52:18.438672 [INFO] consul: New leader elected: Node 5a72e188-a688-abb7-43e1-ebe9be26233f
TestServiceManager_RegisterService - 2019/12/30 18:52:18.772744 [INFO] agent: Synced node info
TestServiceManager_RegisterService - 2019/12/30 18:52:18.772861 [DEBUG] agent: Node info in sync
TestServiceManager_RegisterService - 2019/12/30 18:52:19.196745 [DEBUG] agent: Node info in sync
TestServiceManager_RegisterService - 2019/12/30 18:52:20.796958 [INFO] agent: Synced service "redis"
TestServiceManager_RegisterService - 2019/12/30 18:52:20.797078 [DEBUG] agent: Node info in sync
TestServiceManager_RegisterService - 2019/12/30 18:52:20.797475 [INFO] agent: Requesting shutdown
TestServiceManager_RegisterService - 2019/12/30 18:52:20.797534 [DEBUG] agent: Service "redis" in sync
TestServiceManager_RegisterService - 2019/12/30 18:52:20.797585 [DEBUG] agent: Node info in sync
TestServiceManager_RegisterService - 2019/12/30 18:52:20.797548 [INFO] consul: shutting down server
TestServiceManager_RegisterService - 2019/12/30 18:52:20.797721 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterService - 2019/12/30 18:52:20.970948 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterService - 2019/12/30 18:52:21.021338 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestServiceManager_RegisterService - 2019/12/30 18:52:21.028826 [WARN] consul: error getting server health from "Node 5a72e188-a688-abb7-43e1-ebe9be26233f": rpc error making call: EOF
TestServiceManager_RegisterService - 2019/12/30 18:52:21.104459 [INFO] manager: shutting down
TestServiceManager_RegisterService - 2019/12/30 18:52:21.229496 [ERR] connect: Apply failed leadership lost while committing log
TestServiceManager_RegisterService - 2019/12/30 18:52:21.229583 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestServiceManager_RegisterService - 2019/12/30 18:52:21.230635 [INFO] agent: consul server down
TestServiceManager_RegisterService - 2019/12/30 18:52:21.230740 [INFO] agent: shutdown complete
TestServiceManager_RegisterService - 2019/12/30 18:52:21.230992 [INFO] agent: Stopping DNS server 127.0.0.1:17657 (tcp)
TestServiceManager_RegisterService - 2019/12/30 18:52:21.231295 [INFO] agent: Stopping DNS server 127.0.0.1:17657 (udp)
TestServiceManager_RegisterService - 2019/12/30 18:52:21.231597 [INFO] agent: Stopping HTTP server 127.0.0.1:17658 (tcp)
TestServiceManager_RegisterService - 2019/12/30 18:52:21.231995 [INFO] agent: Waiting for endpoints to shut down
TestServiceManager_RegisterService - 2019/12/30 18:52:21.232090 [INFO] agent: Endpoints down
--- PASS: TestServiceManager_RegisterService (4.16s)
=== RUN   TestServiceManager_RegisterSidecar
WARNING: bootstrap = true: do not enable unless necessary
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:21.294887 [WARN] agent: Node name "Node bcd525ce-f7ab-bbac-49f4-d447c080a222" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:21.295356 [DEBUG] tlsutil: Update with version 1
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:21.297640 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestServiceManager_RegisterService - 2019/12/30 18:52:22.021778 [WARN] consul: error getting server health from "Node 5a72e188-a688-abb7-43e1-ebe9be26233f": context deadline exceeded
TestServiceManager_RegisterService - 2019/12/30 18:52:22.022055 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
2019/12/30 18:52:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bcd525ce-f7ab-bbac-49f4-d447c080a222 Address:127.0.0.1:17668}]
2019/12/30 18:52:22 [INFO]  raft: Node at 127.0.0.1:17668 [Follower] entering Follower state (Leader: "")
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.041712 [INFO] serf: EventMemberJoin: Node bcd525ce-f7ab-bbac-49f4-d447c080a222.dc1 127.0.0.1
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.045200 [INFO] serf: EventMemberJoin: Node bcd525ce-f7ab-bbac-49f4-d447c080a222 127.0.0.1
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.046064 [INFO] consul: Handled member-join event for server "Node bcd525ce-f7ab-bbac-49f4-d447c080a222.dc1" in area "wan"
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.046363 [INFO] consul: Adding LAN server Node bcd525ce-f7ab-bbac-49f4-d447c080a222 (Addr: tcp/127.0.0.1:17668) (DC: dc1)
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.046642 [INFO] agent: Started DNS server 127.0.0.1:17663 (udp)
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.046884 [INFO] agent: Started DNS server 127.0.0.1:17663 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.049242 [INFO] agent: Started HTTP server on 127.0.0.1:17664 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.050006 [INFO] agent: started state syncer
2019/12/30 18:52:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:22 [INFO]  raft: Node at 127.0.0.1:17668 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:22 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:22 [INFO]  raft: Node at 127.0.0.1:17668 [Leader] entering Leader state
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.572411 [INFO] consul: cluster leadership acquired
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.572824 [INFO] consul: New leader elected: Node bcd525ce-f7ab-bbac-49f4-d447c080a222
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:22.997994 [INFO] agent: Synced node info
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:23.992040 [DEBUG] agent.manager: added local registration for service "web-sidecar-proxy"
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.001766 [INFO] agent: Requesting shutdown
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.002410 [INFO] consul: shutting down server
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.002926 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.121025 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.197570 [INFO] manager: shutting down
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.197615 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.197991 [INFO] agent: consul server down
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.198045 [INFO] agent: shutdown complete
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.198097 [INFO] agent: Stopping DNS server 127.0.0.1:17663 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.198243 [INFO] agent: Stopping DNS server 127.0.0.1:17663 (udp)
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.198403 [INFO] agent: Stopping HTTP server 127.0.0.1:17664 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.198600 [INFO] agent: Waiting for endpoints to shut down
TestServiceManager_RegisterSidecar - 2019/12/30 18:52:24.198676 [INFO] agent: Endpoints down
--- PASS: TestServiceManager_RegisterSidecar (2.97s)
=== RUN   TestServiceManager_Disabled
WARNING: bootstrap = true: do not enable unless necessary
TestServiceManager_Disabled - 2019/12/30 18:52:24.306834 [WARN] agent: Node name "Node 28349cf9-fea7-6430-7b7a-931bfe641950" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServiceManager_Disabled - 2019/12/30 18:52:24.307311 [DEBUG] tlsutil: Update with version 1
TestServiceManager_Disabled - 2019/12/30 18:52:24.319604 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:28349cf9-fea7-6430-7b7a-931bfe641950 Address:127.0.0.1:17674}]
2019/12/30 18:52:25 [INFO]  raft: Node at 127.0.0.1:17674 [Follower] entering Follower state (Leader: "")
TestServiceManager_Disabled - 2019/12/30 18:52:25.459055 [INFO] serf: EventMemberJoin: Node 28349cf9-fea7-6430-7b7a-931bfe641950.dc1 127.0.0.1
TestServiceManager_Disabled - 2019/12/30 18:52:25.464484 [INFO] serf: EventMemberJoin: Node 28349cf9-fea7-6430-7b7a-931bfe641950 127.0.0.1
TestServiceManager_Disabled - 2019/12/30 18:52:25.465522 [INFO] consul: Adding LAN server Node 28349cf9-fea7-6430-7b7a-931bfe641950 (Addr: tcp/127.0.0.1:17674) (DC: dc1)
TestServiceManager_Disabled - 2019/12/30 18:52:25.465766 [INFO] agent: Started DNS server 127.0.0.1:17669 (udp)
TestServiceManager_Disabled - 2019/12/30 18:52:25.465786 [INFO] consul: Handled member-join event for server "Node 28349cf9-fea7-6430-7b7a-931bfe641950.dc1" in area "wan"
TestServiceManager_Disabled - 2019/12/30 18:52:25.466168 [INFO] agent: Started DNS server 127.0.0.1:17669 (tcp)
TestServiceManager_Disabled - 2019/12/30 18:52:25.468401 [INFO] agent: Started HTTP server on 127.0.0.1:17670 (tcp)
TestServiceManager_Disabled - 2019/12/30 18:52:25.468483 [INFO] agent: started state syncer
2019/12/30 18:52:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:25 [INFO]  raft: Node at 127.0.0.1:17674 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:26 [INFO]  raft: Node at 127.0.0.1:17674 [Leader] entering Leader state
TestServiceManager_Disabled - 2019/12/30 18:52:26.372022 [INFO] consul: cluster leadership acquired
TestServiceManager_Disabled - 2019/12/30 18:52:26.372572 [INFO] consul: New leader elected: Node 28349cf9-fea7-6430-7b7a-931bfe641950
TestServiceManager_Disabled - 2019/12/30 18:52:26.947306 [INFO] agent: Synced node info
TestServiceManager_Disabled - 2019/12/30 18:52:26.947463 [DEBUG] agent: Node info in sync
TestServiceManager_Disabled - 2019/12/30 18:52:28.482805 [INFO] agent: Requesting shutdown
TestServiceManager_Disabled - 2019/12/30 18:52:28.483218 [INFO] consul: shutting down server
TestServiceManager_Disabled - 2019/12/30 18:52:28.483289 [WARN] serf: Shutdown without a Leave
TestServiceManager_Disabled - 2019/12/30 18:52:28.655984 [WARN] serf: Shutdown without a Leave
TestServiceManager_Disabled - 2019/12/30 18:52:28.746246 [INFO] manager: shutting down
TestServiceManager_Disabled - 2019/12/30 18:52:28.747098 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestServiceManager_Disabled - 2019/12/30 18:52:28.747306 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestServiceManager_Disabled - 2019/12/30 18:52:28.747373 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestServiceManager_Disabled - 2019/12/30 18:52:28.747833 [INFO] agent: consul server down
TestServiceManager_Disabled - 2019/12/30 18:52:28.748511 [INFO] agent: shutdown complete
TestServiceManager_Disabled - 2019/12/30 18:52:28.749236 [INFO] agent: Stopping DNS server 127.0.0.1:17669 (tcp)
TestServiceManager_Disabled - 2019/12/30 18:52:28.750060 [INFO] agent: Stopping DNS server 127.0.0.1:17669 (udp)
TestServiceManager_Disabled - 2019/12/30 18:52:28.750762 [INFO] agent: Stopping HTTP server 127.0.0.1:17670 (tcp)
TestServiceManager_Disabled - 2019/12/30 18:52:28.751981 [INFO] agent: Waiting for endpoints to shut down
TestServiceManager_Disabled - 2019/12/30 18:52:28.753696 [INFO] agent: Endpoints down
--- PASS: TestServiceManager_Disabled (4.56s)
=== RUN   TestSessionCreate
=== PAUSE TestSessionCreate
=== RUN   TestSessionCreate_Delete
=== PAUSE TestSessionCreate_Delete
=== RUN   TestSessionCreate_DefaultCheck
=== PAUSE TestSessionCreate_DefaultCheck
=== RUN   TestSessionCreate_NoCheck
=== PAUSE TestSessionCreate_NoCheck
=== RUN   TestFixupLockDelay
=== PAUSE TestFixupLockDelay
=== RUN   TestSessionDestroy
=== PAUSE TestSessionDestroy
=== RUN   TestSessionCustomTTL
=== PAUSE TestSessionCustomTTL
=== RUN   TestSessionTTLRenew
--- SKIP: TestSessionTTLRenew (0.00s)
    session_endpoint_test.go:372: DM-skipped
=== RUN   TestSessionGet
=== PAUSE TestSessionGet
=== RUN   TestSessionList
=== RUN   TestSessionList/#00
WARNING: bootstrap = true: do not enable unless necessary
TestSessionList/#00 - 2019/12/30 18:52:28.820248 [WARN] agent: Node name "Node d01c7744-e8cd-ea09-edcc-308f437e856e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSessionList/#00 - 2019/12/30 18:52:28.820964 [DEBUG] tlsutil: Update with version 1
TestSessionList/#00 - 2019/12/30 18:52:28.823437 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d01c7744-e8cd-ea09-edcc-308f437e856e Address:127.0.0.1:17680}]
2019/12/30 18:52:30 [INFO]  raft: Node at 127.0.0.1:17680 [Follower] entering Follower state (Leader: "")
TestSessionList/#00 - 2019/12/30 18:52:30.175481 [INFO] serf: EventMemberJoin: Node d01c7744-e8cd-ea09-edcc-308f437e856e.dc1 127.0.0.1
TestSessionList/#00 - 2019/12/30 18:52:30.180402 [INFO] serf: EventMemberJoin: Node d01c7744-e8cd-ea09-edcc-308f437e856e 127.0.0.1
TestSessionList/#00 - 2019/12/30 18:52:30.181286 [INFO] consul: Adding LAN server Node d01c7744-e8cd-ea09-edcc-308f437e856e (Addr: tcp/127.0.0.1:17680) (DC: dc1)
TestSessionList/#00 - 2019/12/30 18:52:30.181716 [INFO] consul: Handled member-join event for server "Node d01c7744-e8cd-ea09-edcc-308f437e856e.dc1" in area "wan"
TestSessionList/#00 - 2019/12/30 18:52:30.181810 [INFO] agent: Started DNS server 127.0.0.1:17675 (udp)
TestSessionList/#00 - 2019/12/30 18:52:30.182157 [INFO] agent: Started DNS server 127.0.0.1:17675 (tcp)
TestSessionList/#00 - 2019/12/30 18:52:30.184627 [INFO] agent: Started HTTP server on 127.0.0.1:17676 (tcp)
TestSessionList/#00 - 2019/12/30 18:52:30.184745 [INFO] agent: started state syncer
2019/12/30 18:52:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:30 [INFO]  raft: Node at 127.0.0.1:17680 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:30 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:30 [INFO]  raft: Node at 127.0.0.1:17680 [Leader] entering Leader state
TestSessionList/#00 - 2019/12/30 18:52:30.688298 [INFO] consul: cluster leadership acquired
TestSessionList/#00 - 2019/12/30 18:52:30.688783 [INFO] consul: New leader elected: Node d01c7744-e8cd-ea09-edcc-308f437e856e
TestSessionList/#00 - 2019/12/30 18:52:31.015343 [INFO] agent: Synced node info
TestSessionList/#00 - 2019/12/30 18:52:31.863521 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSessionList/#00 - 2019/12/30 18:52:31.864008 [DEBUG] consul: Skipping self join check for "Node d01c7744-e8cd-ea09-edcc-308f437e856e" since the cluster is too small
TestSessionList/#00 - 2019/12/30 18:52:31.864162 [INFO] consul: member 'Node d01c7744-e8cd-ea09-edcc-308f437e856e' joined, marking health alive
TestSessionList/#00 - 2019/12/30 18:52:32.048436 [INFO] agent: Requesting shutdown
TestSessionList/#00 - 2019/12/30 18:52:32.048555 [INFO] consul: shutting down server
TestSessionList/#00 - 2019/12/30 18:52:32.048624 [WARN] serf: Shutdown without a Leave
TestSessionList/#00 - 2019/12/30 18:52:32.096227 [WARN] serf: Shutdown without a Leave
TestSessionList/#00 - 2019/12/30 18:52:32.146236 [INFO] manager: shutting down
TestSessionList/#00 - 2019/12/30 18:52:32.146676 [INFO] agent: consul server down
TestSessionList/#00 - 2019/12/30 18:52:32.146731 [INFO] agent: shutdown complete
TestSessionList/#00 - 2019/12/30 18:52:32.146790 [INFO] agent: Stopping DNS server 127.0.0.1:17675 (tcp)
TestSessionList/#00 - 2019/12/30 18:52:32.146944 [INFO] agent: Stopping DNS server 127.0.0.1:17675 (udp)
TestSessionList/#00 - 2019/12/30 18:52:32.147120 [INFO] agent: Stopping HTTP server 127.0.0.1:17676 (tcp)
TestSessionList/#00 - 2019/12/30 18:52:32.147349 [INFO] agent: Waiting for endpoints to shut down
TestSessionList/#00 - 2019/12/30 18:52:32.147420 [INFO] agent: Endpoints down
=== RUN   TestSessionList/#01
WARNING: bootstrap = true: do not enable unless necessary
TestSessionList/#01 - 2019/12/30 18:52:32.217267 [WARN] agent: Node name "Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSessionList/#01 - 2019/12/30 18:52:32.217835 [DEBUG] tlsutil: Update with version 1
TestSessionList/#01 - 2019/12/30 18:52:32.220216 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:444459a8-c1cf-15ec-6a3b-5aee5a7255ed Address:127.0.0.1:17686}]
2019/12/30 18:52:32 [INFO]  raft: Node at 127.0.0.1:17686 [Follower] entering Follower state (Leader: "")
TestSessionList/#01 - 2019/12/30 18:52:32.918252 [INFO] serf: EventMemberJoin: Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed.dc1 127.0.0.1
TestSessionList/#01 - 2019/12/30 18:52:32.923316 [INFO] serf: EventMemberJoin: Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed 127.0.0.1
TestSessionList/#01 - 2019/12/30 18:52:32.925108 [INFO] consul: Adding LAN server Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed (Addr: tcp/127.0.0.1:17686) (DC: dc1)
TestSessionList/#01 - 2019/12/30 18:52:32.925756 [INFO] consul: Handled member-join event for server "Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed.dc1" in area "wan"
TestSessionList/#01 - 2019/12/30 18:52:32.927879 [INFO] agent: Started DNS server 127.0.0.1:17681 (tcp)
TestSessionList/#01 - 2019/12/30 18:52:32.928270 [INFO] agent: Started DNS server 127.0.0.1:17681 (udp)
TestSessionList/#01 - 2019/12/30 18:52:32.930822 [INFO] agent: Started HTTP server on 127.0.0.1:17682 (tcp)
TestSessionList/#01 - 2019/12/30 18:52:32.930953 [INFO] agent: started state syncer
2019/12/30 18:52:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:32 [INFO]  raft: Node at 127.0.0.1:17686 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:33 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:33 [INFO]  raft: Node at 127.0.0.1:17686 [Leader] entering Leader state
TestSessionList/#01 - 2019/12/30 18:52:33.915050 [INFO] consul: cluster leadership acquired
TestSessionList/#01 - 2019/12/30 18:52:33.915480 [INFO] consul: New leader elected: Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed
TestSessionList/#01 - 2019/12/30 18:52:34.255963 [INFO] agent: Synced node info
TestSessionList/#01 - 2019/12/30 18:52:34.256104 [DEBUG] agent: Node info in sync
TestSessionList/#01 - 2019/12/30 18:52:34.513884 [DEBUG] agent: Node info in sync
TestSessionList/#01 - 2019/12/30 18:52:35.622688 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSessionList/#01 - 2019/12/30 18:52:35.623795 [DEBUG] consul: Skipping self join check for "Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed" since the cluster is too small
TestSessionList/#01 - 2019/12/30 18:52:35.623994 [INFO] consul: member 'Node 444459a8-c1cf-15ec-6a3b-5aee5a7255ed' joined, marking health alive
TestSessionList/#01 - 2019/12/30 18:52:37.538815 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSessionList/#01 - 2019/12/30 18:52:39.072455 [INFO] agent: Requesting shutdown
TestSessionList/#01 - 2019/12/30 18:52:39.072555 [INFO] consul: shutting down server
TestSessionList/#01 - 2019/12/30 18:52:39.072607 [WARN] serf: Shutdown without a Leave
TestSessionList/#01 - 2019/12/30 18:52:39.129743 [WARN] serf: Shutdown without a Leave
TestSessionList/#01 - 2019/12/30 18:52:39.179849 [INFO] manager: shutting down
TestSessionList/#01 - 2019/12/30 18:52:39.180601 [INFO] agent: consul server down
TestSessionList/#01 - 2019/12/30 18:52:39.180661 [INFO] agent: shutdown complete
TestSessionList/#01 - 2019/12/30 18:52:39.180715 [INFO] agent: Stopping DNS server 127.0.0.1:17681 (tcp)
TestSessionList/#01 - 2019/12/30 18:52:39.180852 [INFO] agent: Stopping DNS server 127.0.0.1:17681 (udp)
TestSessionList/#01 - 2019/12/30 18:52:39.180999 [INFO] agent: Stopping HTTP server 127.0.0.1:17682 (tcp)
TestSessionList/#01 - 2019/12/30 18:52:39.181278 [INFO] agent: Waiting for endpoints to shut down
TestSessionList/#01 - 2019/12/30 18:52:39.181337 [INFO] agent: Endpoints down
--- PASS: TestSessionList (10.43s)
    --- PASS: TestSessionList/#00 (3.39s)
    --- PASS: TestSessionList/#01 (7.03s)
=== RUN   TestSessionsForNode
=== PAUSE TestSessionsForNode
=== RUN   TestSessionDeleteDestroy
=== PAUSE TestSessionDeleteDestroy
=== RUN   TestAgent_sidecarServiceFromNodeService
=== RUN   TestAgent_sidecarServiceFromNodeService/no_sidecar
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:39.313136 [WARN] agent: Node name "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:39.313698 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:39.315962 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f3b42e12-d0ce-84ea-a863-97fa8b8c0786 Address:127.0.0.1:17692}]
2019/12/30 18:52:40 [INFO]  raft: Node at 127.0.0.1:17692 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:40.119630 [INFO] serf: EventMemberJoin: Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786.dc1 127.0.0.1
jones - 2019/12/30 18:52:40.123978 [INFO] serf: EventMemberJoin: Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786 127.0.0.1
jones - 2019/12/30 18:52:40.125682 [INFO] consul: Adding LAN server Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786 (Addr: tcp/127.0.0.1:17692) (DC: dc1)
jones - 2019/12/30 18:52:40.126241 [INFO] consul: Handled member-join event for server "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786.dc1" in area "wan"
jones - 2019/12/30 18:52:40.128642 [INFO] agent: Started DNS server 127.0.0.1:17687 (tcp)
jones - 2019/12/30 18:52:40.128732 [INFO] agent: Started DNS server 127.0.0.1:17687 (udp)
jones - 2019/12/30 18:52:40.131850 [INFO] agent: Started HTTP server on 127.0.0.1:17688 (tcp)
jones - 2019/12/30 18:52:40.131958 [INFO] agent: started state syncer
2019/12/30 18:52:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:40 [INFO]  raft: Node at 127.0.0.1:17692 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:40 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:40 [INFO]  raft: Node at 127.0.0.1:17692 [Leader] entering Leader state
jones - 2019/12/30 18:52:40.709545 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:40.709972 [INFO] consul: New leader elected: Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786
=== RUN   TestAgent_sidecarServiceFromNodeService/all_the_defaults
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:40.840785 [WARN] agent: Node name "Node 90e88a15-5862-4de0-2f1f-c638261bac76" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:40.841610 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:40.844267 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:41.039151 [INFO] agent: Synced node info
jones - 2019/12/30 18:52:41.836078 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:41.836199 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:42.305517 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:42.306052 [DEBUG] consul: Skipping self join check for "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786" since the cluster is too small
jones - 2019/12/30 18:52:42.306250 [INFO] consul: member 'Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786' joined, marking health alive
2019/12/30 18:52:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:90e88a15-5862-4de0-2f1f-c638261bac76 Address:127.0.0.1:17698}]
2019/12/30 18:52:42 [INFO]  raft: Node at 127.0.0.1:17698 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:42.401793 [INFO] serf: EventMemberJoin: Node 90e88a15-5862-4de0-2f1f-c638261bac76.dc1 127.0.0.1
jones - 2019/12/30 18:52:42.407120 [INFO] serf: EventMemberJoin: Node 90e88a15-5862-4de0-2f1f-c638261bac76 127.0.0.1
jones - 2019/12/30 18:52:42.407963 [INFO] consul: Adding LAN server Node 90e88a15-5862-4de0-2f1f-c638261bac76 (Addr: tcp/127.0.0.1:17698) (DC: dc1)
jones - 2019/12/30 18:52:42.408090 [INFO] consul: Handled member-join event for server "Node 90e88a15-5862-4de0-2f1f-c638261bac76.dc1" in area "wan"
jones - 2019/12/30 18:52:42.410403 [INFO] agent: Started DNS server 127.0.0.1:17693 (tcp)
jones - 2019/12/30 18:52:42.410621 [INFO] agent: Started DNS server 127.0.0.1:17693 (udp)
jones - 2019/12/30 18:52:42.413109 [INFO] agent: Started HTTP server on 127.0.0.1:17694 (tcp)
jones - 2019/12/30 18:52:42.413218 [INFO] agent: started state syncer
2019/12/30 18:52:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:42 [INFO]  raft: Node at 127.0.0.1:17698 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:42 [INFO]  raft: Node at 127.0.0.1:17698 [Leader] entering Leader state
jones - 2019/12/30 18:52:42.997014 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:42.997482 [INFO] consul: New leader elected: Node 90e88a15-5862-4de0-2f1f-c638261bac76
=== RUN   TestAgent_sidecarServiceFromNodeService/all_the_allowed_overrides
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:43.095448 [WARN] agent: Node name "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:43.096115 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:43.098329 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:43.214822 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:43.383126 [INFO] agent: Synced node info
2019/12/30 18:52:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:39b35a7a-e61e-e83c-e3aa-2917305b86d6 Address:127.0.0.1:17704}]
jones - 2019/12/30 18:52:44.067243 [INFO] serf: EventMemberJoin: Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6.dc1 127.0.0.1
2019/12/30 18:52:44 [INFO]  raft: Node at 127.0.0.1:17704 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:44.072165 [INFO] serf: EventMemberJoin: Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6 127.0.0.1
jones - 2019/12/30 18:52:44.073514 [INFO] consul: Adding LAN server Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6 (Addr: tcp/127.0.0.1:17704) (DC: dc1)
jones - 2019/12/30 18:52:44.074089 [INFO] consul: Handled member-join event for server "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6.dc1" in area "wan"
jones - 2019/12/30 18:52:44.075964 [INFO] agent: Started DNS server 127.0.0.1:17699 (tcp)
jones - 2019/12/30 18:52:44.076154 [INFO] agent: Started DNS server 127.0.0.1:17699 (udp)
jones - 2019/12/30 18:52:44.078649 [INFO] agent: Started HTTP server on 127.0.0.1:17700 (tcp)
jones - 2019/12/30 18:52:44.078756 [INFO] agent: started state syncer
2019/12/30 18:52:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:44 [INFO]  raft: Node at 127.0.0.1:17704 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:52:44.514105 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:44.515051 [DEBUG] consul: Skipping self join check for "Node 90e88a15-5862-4de0-2f1f-c638261bac76" since the cluster is too small
jones - 2019/12/30 18:52:44.515300 [INFO] consul: member 'Node 90e88a15-5862-4de0-2f1f-c638261bac76' joined, marking health alive
2019/12/30 18:52:45 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:45 [INFO]  raft: Node at 127.0.0.1:17704 [Leader] entering Leader state
jones - 2019/12/30 18:52:45.021992 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:45.022547 [INFO] consul: New leader elected: Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6
=== RUN   TestAgent_sidecarServiceFromNodeService/no_auto_ports_available
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:45.280740 [WARN] agent: Node name "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:45.281472 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:45.284919 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:45.464060 [INFO] agent: Synced node info
jones - 2019/12/30 18:52:45.468055 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:45.542132 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:45.857295 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:46.099355 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:52:46.099511 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:46.099594 [DEBUG] agent: Node info in sync
2019/12/30 18:52:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:21fec6d0-1e2c-94e5-aea5-373e1f263aef Address:127.0.0.1:17710}]
2019/12/30 18:52:46 [INFO]  raft: Node at 127.0.0.1:17710 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:46.388433 [INFO] serf: EventMemberJoin: Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef.dc1 127.0.0.1
jones - 2019/12/30 18:52:46.391646 [INFO] serf: EventMemberJoin: Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef 127.0.0.1
jones - 2019/12/30 18:52:46.400718 [INFO] agent: Started DNS server 127.0.0.1:17705 (udp)
jones - 2019/12/30 18:52:46.400764 [INFO] consul: Adding LAN server Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef (Addr: tcp/127.0.0.1:17710) (DC: dc1)
jones - 2019/12/30 18:52:46.400928 [INFO] consul: Handled member-join event for server "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef.dc1" in area "wan"
jones - 2019/12/30 18:52:46.401273 [INFO] agent: Started DNS server 127.0.0.1:17705 (tcp)
jones - 2019/12/30 18:52:46.404139 [INFO] agent: Started HTTP server on 127.0.0.1:17706 (tcp)
jones - 2019/12/30 18:52:46.404238 [INFO] agent: started state syncer
2019/12/30 18:52:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:46 [INFO]  raft: Node at 127.0.0.1:17710 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:52:46.838941 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:46.839982 [DEBUG] consul: Skipping self join check for "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6" since the cluster is too small
jones - 2019/12/30 18:52:46.840172 [INFO] consul: member 'Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6' joined, marking health alive
2019/12/30 18:52:47 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:47 [INFO]  raft: Node at 127.0.0.1:17710 [Leader] entering Leader state
jones - 2019/12/30 18:52:47.023222 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:47.023742 [INFO] consul: New leader elected: Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef
=== RUN   TestAgent_sidecarServiceFromNodeService/auto_ports_disabled
jones - 2019/12/30 18:52:47.149637 [ERR] leaf watch error: invalid type for leaf response: <nil>
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:47.228051 [WARN] agent: Node name "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:47.228696 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:47.231412 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:47.767164 [INFO] agent: Synced service "api-proxy-sidecar"
jones - 2019/12/30 18:52:47.767300 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:47.767435 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/30 18:52:47.767494 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:47.816845 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b8c654fc-da2b-ce2a-7f10-53f103af6c8d Address:127.0.0.1:17716}]
2019/12/30 18:52:48 [INFO]  raft: Node at 127.0.0.1:17716 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:48.492425 [INFO] serf: EventMemberJoin: Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d.dc1 127.0.0.1
jones - 2019/12/30 18:52:48.499344 [INFO] serf: EventMemberJoin: Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d 127.0.0.1
jones - 2019/12/30 18:52:48.500506 [INFO] consul: Adding LAN server Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d (Addr: tcp/127.0.0.1:17716) (DC: dc1)
jones - 2019/12/30 18:52:48.500907 [INFO] consul: Handled member-join event for server "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d.dc1" in area "wan"
jones - 2019/12/30 18:52:48.501528 [INFO] agent: Started DNS server 127.0.0.1:17711 (tcp)
jones - 2019/12/30 18:52:48.502402 [INFO] agent: Started DNS server 127.0.0.1:17711 (udp)
jones - 2019/12/30 18:52:48.504965 [INFO] agent: Started HTTP server on 127.0.0.1:17712 (tcp)
jones - 2019/12/30 18:52:48.505067 [INFO] agent: started state syncer
2019/12/30 18:52:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:48 [INFO]  raft: Node at 127.0.0.1:17716 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:49 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:49 [INFO]  raft: Node at 127.0.0.1:17716 [Leader] entering Leader state
jones - 2019/12/30 18:52:49.514496 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:49.514905 [INFO] consul: New leader elected: Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d
=== RUN   TestAgent_sidecarServiceFromNodeService/inherit_tags_and_meta
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:49.735193 [WARN] agent: Node name "Node 5122c9d8-8979-c841-956f-094a90e62880" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:49.735882 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:49.738236 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:49.760053 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:49.760462 [DEBUG] consul: Skipping self join check for "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef" since the cluster is too small
jones - 2019/12/30 18:52:49.760622 [INFO] consul: member 'Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef' joined, marking health alive
jones - 2019/12/30 18:52:49.931413 [INFO] agent: Synced node info
jones - 2019/12/30 18:52:50.166880 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:50.405062 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:50.405189 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:50.441510 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:52:50.441601 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/30 18:52:50.441637 [DEBUG] agent: Node info in sync
2019/12/30 18:52:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5122c9d8-8979-c841-956f-094a90e62880 Address:127.0.0.1:17722}]
jones - 2019/12/30 18:52:51.017140 [INFO] serf: EventMemberJoin: Node 5122c9d8-8979-c841-956f-094a90e62880.dc1 127.0.0.1
2019/12/30 18:52:51 [INFO]  raft: Node at 127.0.0.1:17722 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:51.040983 [INFO] serf: EventMemberJoin: Node 5122c9d8-8979-c841-956f-094a90e62880 127.0.0.1
jones - 2019/12/30 18:52:51.042835 [INFO] consul: Adding LAN server Node 5122c9d8-8979-c841-956f-094a90e62880 (Addr: tcp/127.0.0.1:17722) (DC: dc1)
jones - 2019/12/30 18:52:51.044085 [INFO] consul: Handled member-join event for server "Node 5122c9d8-8979-c841-956f-094a90e62880.dc1" in area "wan"
jones - 2019/12/30 18:52:51.050022 [INFO] agent: Started DNS server 127.0.0.1:17717 (udp)
jones - 2019/12/30 18:52:51.050385 [INFO] agent: Started DNS server 127.0.0.1:17717 (tcp)
jones - 2019/12/30 18:52:51.052834 [INFO] agent: Started HTTP server on 127.0.0.1:17718 (tcp)
jones - 2019/12/30 18:52:51.052938 [INFO] agent: started state syncer
2019/12/30 18:52:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:51 [INFO]  raft: Node at 127.0.0.1:17722 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:52:51.225669 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:51.226170 [DEBUG] consul: Skipping self join check for "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d" since the cluster is too small
jones - 2019/12/30 18:52:51.226325 [INFO] consul: member 'Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d' joined, marking health alive
jones - 2019/12/30 18:52:52.705389 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:53 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:53 [INFO]  raft: Node at 127.0.0.1:17722 [Leader] entering Leader state
jones - 2019/12/30 18:52:53.014204 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:53.014895 [INFO] consul: New leader elected: Node 5122c9d8-8979-c841-956f-094a90e62880
=== RUN   TestAgent_sidecarServiceFromNodeService/invalid_check_type
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:53.301403 [WARN] agent: Node name "Node a8b3e297-b53a-bcd0-efda-5addcd938805" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:53.301974 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:53.304730 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:53.647682 [INFO] agent: Synced node info
jones - 2019/12/30 18:52:53.752141 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:53.752274 [DEBUG] agent: Node info in sync
2019/12/30 18:52:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a8b3e297-b53a-bcd0-efda-5addcd938805 Address:127.0.0.1:17728}]
2019/12/30 18:52:54 [INFO]  raft: Node at 127.0.0.1:17728 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:54.667738 [INFO] serf: EventMemberJoin: Node a8b3e297-b53a-bcd0-efda-5addcd938805.dc1 127.0.0.1
jones - 2019/12/30 18:52:54.671143 [INFO] serf: EventMemberJoin: Node a8b3e297-b53a-bcd0-efda-5addcd938805 127.0.0.1
jones - 2019/12/30 18:52:54.671947 [INFO] consul: Handled member-join event for server "Node a8b3e297-b53a-bcd0-efda-5addcd938805.dc1" in area "wan"
jones - 2019/12/30 18:52:54.672338 [INFO] consul: Adding LAN server Node a8b3e297-b53a-bcd0-efda-5addcd938805 (Addr: tcp/127.0.0.1:17728) (DC: dc1)
jones - 2019/12/30 18:52:54.672480 [INFO] agent: Started DNS server 127.0.0.1:17723 (udp)
jones - 2019/12/30 18:52:54.672829 [INFO] agent: Started DNS server 127.0.0.1:17723 (tcp)
jones - 2019/12/30 18:52:54.675421 [INFO] agent: Started HTTP server on 127.0.0.1:17724 (tcp)
jones - 2019/12/30 18:52:54.675557 [INFO] agent: started state syncer
2019/12/30 18:52:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:54 [INFO]  raft: Node at 127.0.0.1:17728 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:52:55.372521 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:55.373124 [DEBUG] consul: Skipping self join check for "Node 5122c9d8-8979-c841-956f-094a90e62880" since the cluster is too small
jones - 2019/12/30 18:52:55.373280 [INFO] consul: member 'Node 5122c9d8-8979-c841-956f-094a90e62880' joined, marking health alive
2019/12/30 18:52:55 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:55 [INFO]  raft: Node at 127.0.0.1:17728 [Leader] entering Leader state
jones - 2019/12/30 18:52:55.572766 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:55.573239 [INFO] consul: New leader elected: Node a8b3e297-b53a-bcd0-efda-5addcd938805
jones - 2019/12/30 18:52:55.974992 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:56.224947 [INFO] agent: Synced node info
=== RUN   TestAgent_sidecarServiceFromNodeService/invalid_meta
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:56.299843 [WARN] agent: Node name "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:56.300386 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:56.302803 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:57.106421 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:57.106904 [DEBUG] consul: Skipping self join check for "Node a8b3e297-b53a-bcd0-efda-5addcd938805" since the cluster is too small
jones - 2019/12/30 18:52:57.107070 [INFO] consul: member 'Node a8b3e297-b53a-bcd0-efda-5addcd938805' joined, marking health alive
2019/12/30 18:52:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4707dfb8-f5da-73eb-b8a2-b66f848ceb6d Address:127.0.0.1:17734}]
2019/12/30 18:52:57 [INFO]  raft: Node at 127.0.0.1:17734 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:57.392230 [INFO] serf: EventMemberJoin: Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d.dc1 127.0.0.1
jones - 2019/12/30 18:52:57.396024 [INFO] serf: EventMemberJoin: Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d 127.0.0.1
jones - 2019/12/30 18:52:57.396897 [INFO] consul: Handled member-join event for server "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d.dc1" in area "wan"
jones - 2019/12/30 18:52:57.396911 [INFO] consul: Adding LAN server Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d (Addr: tcp/127.0.0.1:17734) (DC: dc1)
jones - 2019/12/30 18:52:57.397674 [INFO] agent: Started DNS server 127.0.0.1:17729 (udp)
jones - 2019/12/30 18:52:57.397751 [INFO] agent: Started DNS server 127.0.0.1:17729 (tcp)
jones - 2019/12/30 18:52:57.400197 [INFO] agent: Started HTTP server on 127.0.0.1:17730 (tcp)
jones - 2019/12/30 18:52:57.400291 [INFO] agent: started state syncer
2019/12/30 18:52:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:57 [INFO]  raft: Node at 127.0.0.1:17734 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:57 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:57 [INFO]  raft: Node at 127.0.0.1:17734 [Leader] entering Leader state
jones - 2019/12/30 18:52:57.863688 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:52:57.864108 [INFO] consul: New leader elected: Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d
=== RUN   TestAgent_sidecarServiceFromNodeService/re-registering_same_sidecar_with_no_port_should_pick_same_one
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/30 18:52:58.032531 [WARN] agent: Node name "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/30 18:52:58.032923 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/30 18:52:58.035486 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:58.155907 [INFO] agent: Synced node info
jones - 2019/12/30 18:52:58.230401 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:52:59.205101 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:52:59.205179 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:52:59.205251 [DEBUG] agent: Node info in sync
2019/12/30 18:52:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f632792c-c81a-fbfb-b7c4-e99bdb454ade Address:127.0.0.1:17740}]
2019/12/30 18:52:59 [INFO]  raft: Node at 127.0.0.1:17740 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:52:59.509333 [INFO] serf: EventMemberJoin: Node f632792c-c81a-fbfb-b7c4-e99bdb454ade.dc1 127.0.0.1
jones - 2019/12/30 18:52:59.512734 [INFO] serf: EventMemberJoin: Node f632792c-c81a-fbfb-b7c4-e99bdb454ade 127.0.0.1
jones - 2019/12/30 18:52:59.513431 [INFO] consul: Adding LAN server Node f632792c-c81a-fbfb-b7c4-e99bdb454ade (Addr: tcp/127.0.0.1:17740) (DC: dc1)
jones - 2019/12/30 18:52:59.513445 [INFO] consul: Handled member-join event for server "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade.dc1" in area "wan"
jones - 2019/12/30 18:52:59.514109 [INFO] agent: Started DNS server 127.0.0.1:17735 (tcp)
jones - 2019/12/30 18:52:59.514183 [INFO] agent: Started DNS server 127.0.0.1:17735 (udp)
jones - 2019/12/30 18:52:59.517110 [INFO] agent: Started HTTP server on 127.0.0.1:17736 (tcp)
jones - 2019/12/30 18:52:59.517256 [INFO] agent: started state syncer
2019/12/30 18:52:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:59 [INFO]  raft: Node at 127.0.0.1:17740 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:52:59.648017 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:52:59.648567 [DEBUG] consul: Skipping self join check for "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d" since the cluster is too small
jones - 2019/12/30 18:52:59.648731 [INFO] consul: member 'Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d' joined, marking health alive
2019/12/30 18:53:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:00 [INFO]  raft: Node at 127.0.0.1:17740 [Leader] entering Leader state
jones - 2019/12/30 18:53:00.149034 [INFO] consul: cluster leadership acquired
jones - 2019/12/30 18:53:00.149585 [INFO] consul: New leader elected: Node f632792c-c81a-fbfb-b7c4-e99bdb454ade
jones - 2019/12/30 18:53:00.357772 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:53:01.027343 [INFO] agent: Synced node info
--- PASS: TestAgent_sidecarServiceFromNodeService (21.85s)
jones - 2019/12/30 18:53:01.033254 [ERR] leaf watch error: invalid type for leaf response: <nil>
    --- PASS: TestAgent_sidecarServiceFromNodeService/no_sidecar (1.58s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/all_the_defaults (2.26s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/all_the_allowed_overrides (2.18s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/no_auto_ports_available (1.95s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/auto_ports_disabled (2.53s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/inherit_tags_and_meta (3.36s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/invalid_check_type (3.19s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/invalid_meta (1.75s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/re-registering_same_sidecar_with_no_port_should_pick_same_one (3.06s)
=== RUN   TestSnapshot
--- SKIP: TestSnapshot (0.00s)
    snapshot_endpoint_test.go:16: DM-skipped
=== RUN   TestSnapshot_Options
=== PAUSE TestSnapshot_Options
=== RUN   TestStatusLeader
--- SKIP: TestStatusLeader (0.00s)
    status_endpoint_test.go:11: DM-skipped
=== RUN   TestStatusPeers
=== PAUSE TestStatusPeers
=== RUN   TestDefaultConfig
=== RUN   TestDefaultConfig/#00
=== PAUSE TestDefaultConfig/#00
=== RUN   TestDefaultConfig/#01
=== PAUSE TestDefaultConfig/#01
=== RUN   TestDefaultConfig/#02
=== PAUSE TestDefaultConfig/#02
=== RUN   TestDefaultConfig/#03
=== PAUSE TestDefaultConfig/#03
=== RUN   TestDefaultConfig/#04
=== PAUSE TestDefaultConfig/#04
=== RUN   TestDefaultConfig/#05
=== PAUSE TestDefaultConfig/#05
=== RUN   TestDefaultConfig/#06
=== PAUSE TestDefaultConfig/#06
=== RUN   TestDefaultConfig/#07
=== PAUSE TestDefaultConfig/#07
=== RUN   TestDefaultConfig/#08
=== PAUSE TestDefaultConfig/#08
=== RUN   TestDefaultConfig/#09
=== PAUSE TestDefaultConfig/#09
=== RUN   TestDefaultConfig/#10
=== PAUSE TestDefaultConfig/#10
=== RUN   TestDefaultConfig/#11
=== PAUSE TestDefaultConfig/#11
=== RUN   TestDefaultConfig/#12
=== PAUSE TestDefaultConfig/#12
=== RUN   TestDefaultConfig/#13
=== PAUSE TestDefaultConfig/#13
=== RUN   TestDefaultConfig/#14
=== PAUSE TestDefaultConfig/#14
=== RUN   TestDefaultConfig/#15
=== PAUSE TestDefaultConfig/#15
=== RUN   TestDefaultConfig/#16
=== PAUSE TestDefaultConfig/#16
=== RUN   TestDefaultConfig/#17
=== PAUSE TestDefaultConfig/#17
=== RUN   TestDefaultConfig/#18
=== PAUSE TestDefaultConfig/#18
=== RUN   TestDefaultConfig/#19
=== PAUSE TestDefaultConfig/#19
=== RUN   TestDefaultConfig/#20
=== PAUSE TestDefaultConfig/#20
=== RUN   TestDefaultConfig/#21
=== PAUSE TestDefaultConfig/#21
=== RUN   TestDefaultConfig/#22
=== PAUSE TestDefaultConfig/#22
=== RUN   TestDefaultConfig/#23
=== PAUSE TestDefaultConfig/#23
=== RUN   TestDefaultConfig/#24
=== PAUSE TestDefaultConfig/#24
=== RUN   TestDefaultConfig/#25
=== PAUSE TestDefaultConfig/#25
=== RUN   TestDefaultConfig/#26
=== PAUSE TestDefaultConfig/#26
=== RUN   TestDefaultConfig/#27
=== PAUSE TestDefaultConfig/#27
=== RUN   TestDefaultConfig/#28
=== PAUSE TestDefaultConfig/#28
=== RUN   TestDefaultConfig/#29
=== PAUSE TestDefaultConfig/#29
=== RUN   TestDefaultConfig/#30
=== PAUSE TestDefaultConfig/#30
=== RUN   TestDefaultConfig/#31
=== PAUSE TestDefaultConfig/#31
=== RUN   TestDefaultConfig/#32
=== PAUSE TestDefaultConfig/#32
=== RUN   TestDefaultConfig/#33
=== PAUSE TestDefaultConfig/#33
=== RUN   TestDefaultConfig/#34
=== PAUSE TestDefaultConfig/#34
=== RUN   TestDefaultConfig/#35
=== PAUSE TestDefaultConfig/#35
=== RUN   TestDefaultConfig/#36
=== PAUSE TestDefaultConfig/#36
=== RUN   TestDefaultConfig/#37
=== PAUSE TestDefaultConfig/#37
=== RUN   TestDefaultConfig/#38
=== PAUSE TestDefaultConfig/#38
=== RUN   TestDefaultConfig/#39
=== PAUSE TestDefaultConfig/#39
=== RUN   TestDefaultConfig/#40
=== PAUSE TestDefaultConfig/#40
=== RUN   TestDefaultConfig/#41
=== PAUSE TestDefaultConfig/#41
=== RUN   TestDefaultConfig/#42
=== PAUSE TestDefaultConfig/#42
=== RUN   TestDefaultConfig/#43
=== PAUSE TestDefaultConfig/#43
=== RUN   TestDefaultConfig/#44
=== PAUSE TestDefaultConfig/#44
=== RUN   TestDefaultConfig/#45
=== PAUSE TestDefaultConfig/#45
=== RUN   TestDefaultConfig/#46
=== PAUSE TestDefaultConfig/#46
=== RUN   TestDefaultConfig/#47
=== PAUSE TestDefaultConfig/#47
=== RUN   TestDefaultConfig/#48
=== PAUSE TestDefaultConfig/#48
=== RUN   TestDefaultConfig/#49
=== PAUSE TestDefaultConfig/#49
=== RUN   TestDefaultConfig/#50
=== PAUSE TestDefaultConfig/#50
=== RUN   TestDefaultConfig/#51
=== PAUSE TestDefaultConfig/#51
=== RUN   TestDefaultConfig/#52
=== PAUSE TestDefaultConfig/#52
=== RUN   TestDefaultConfig/#53
=== PAUSE TestDefaultConfig/#53
=== RUN   TestDefaultConfig/#54
=== PAUSE TestDefaultConfig/#54
=== RUN   TestDefaultConfig/#55
=== PAUSE TestDefaultConfig/#55
=== RUN   TestDefaultConfig/#56
=== PAUSE TestDefaultConfig/#56
=== RUN   TestDefaultConfig/#57
=== PAUSE TestDefaultConfig/#57
=== RUN   TestDefaultConfig/#58
=== PAUSE TestDefaultConfig/#58
=== RUN   TestDefaultConfig/#59
=== PAUSE TestDefaultConfig/#59
=== RUN   TestDefaultConfig/#60
=== PAUSE TestDefaultConfig/#60
=== RUN   TestDefaultConfig/#61
=== PAUSE TestDefaultConfig/#61
=== RUN   TestDefaultConfig/#62
=== PAUSE TestDefaultConfig/#62
=== RUN   TestDefaultConfig/#63
=== PAUSE TestDefaultConfig/#63
=== RUN   TestDefaultConfig/#64
=== PAUSE TestDefaultConfig/#64
=== RUN   TestDefaultConfig/#65
=== PAUSE TestDefaultConfig/#65
=== RUN   TestDefaultConfig/#66
=== PAUSE TestDefaultConfig/#66
=== RUN   TestDefaultConfig/#67
=== PAUSE TestDefaultConfig/#67
=== RUN   TestDefaultConfig/#68
=== PAUSE TestDefaultConfig/#68
=== RUN   TestDefaultConfig/#69
=== PAUSE TestDefaultConfig/#69
=== RUN   TestDefaultConfig/#70
=== PAUSE TestDefaultConfig/#70
=== RUN   TestDefaultConfig/#71
=== PAUSE TestDefaultConfig/#71
=== RUN   TestDefaultConfig/#72
=== PAUSE TestDefaultConfig/#72
=== RUN   TestDefaultConfig/#73
=== PAUSE TestDefaultConfig/#73
=== RUN   TestDefaultConfig/#74
=== PAUSE TestDefaultConfig/#74
=== RUN   TestDefaultConfig/#75
=== PAUSE TestDefaultConfig/#75
=== RUN   TestDefaultConfig/#76
=== PAUSE TestDefaultConfig/#76
=== RUN   TestDefaultConfig/#77
=== PAUSE TestDefaultConfig/#77
=== RUN   TestDefaultConfig/#78
=== PAUSE TestDefaultConfig/#78
=== RUN   TestDefaultConfig/#79
=== PAUSE TestDefaultConfig/#79
=== RUN   TestDefaultConfig/#80
=== PAUSE TestDefaultConfig/#80
=== RUN   TestDefaultConfig/#81
=== PAUSE TestDefaultConfig/#81
=== RUN   TestDefaultConfig/#82
=== PAUSE TestDefaultConfig/#82
=== RUN   TestDefaultConfig/#83
=== PAUSE TestDefaultConfig/#83
=== RUN   TestDefaultConfig/#84
=== PAUSE TestDefaultConfig/#84
=== RUN   TestDefaultConfig/#85
=== PAUSE TestDefaultConfig/#85
=== RUN   TestDefaultConfig/#86
=== PAUSE TestDefaultConfig/#86
=== RUN   TestDefaultConfig/#87
=== PAUSE TestDefaultConfig/#87
=== RUN   TestDefaultConfig/#88
=== PAUSE TestDefaultConfig/#88
=== RUN   TestDefaultConfig/#89
=== PAUSE TestDefaultConfig/#89
=== RUN   TestDefaultConfig/#90
=== PAUSE TestDefaultConfig/#90
=== RUN   TestDefaultConfig/#91
=== PAUSE TestDefaultConfig/#91
=== RUN   TestDefaultConfig/#92
=== PAUSE TestDefaultConfig/#92
=== RUN   TestDefaultConfig/#93
=== PAUSE TestDefaultConfig/#93
=== RUN   TestDefaultConfig/#94
=== PAUSE TestDefaultConfig/#94
=== RUN   TestDefaultConfig/#95
=== PAUSE TestDefaultConfig/#95
=== RUN   TestDefaultConfig/#96
=== PAUSE TestDefaultConfig/#96
=== RUN   TestDefaultConfig/#97
=== PAUSE TestDefaultConfig/#97
=== RUN   TestDefaultConfig/#98
=== PAUSE TestDefaultConfig/#98
=== RUN   TestDefaultConfig/#99
=== PAUSE TestDefaultConfig/#99
=== RUN   TestDefaultConfig/#100
=== PAUSE TestDefaultConfig/#100
=== RUN   TestDefaultConfig/#101
=== PAUSE TestDefaultConfig/#101
=== RUN   TestDefaultConfig/#102
=== PAUSE TestDefaultConfig/#102
=== RUN   TestDefaultConfig/#103
=== PAUSE TestDefaultConfig/#103
=== RUN   TestDefaultConfig/#104
=== PAUSE TestDefaultConfig/#104
=== RUN   TestDefaultConfig/#105
=== PAUSE TestDefaultConfig/#105
=== RUN   TestDefaultConfig/#106
=== PAUSE TestDefaultConfig/#106
=== RUN   TestDefaultConfig/#107
=== PAUSE TestDefaultConfig/#107
=== RUN   TestDefaultConfig/#108
=== PAUSE TestDefaultConfig/#108
=== RUN   TestDefaultConfig/#109
=== PAUSE TestDefaultConfig/#109
=== RUN   TestDefaultConfig/#110
=== PAUSE TestDefaultConfig/#110
=== RUN   TestDefaultConfig/#111
=== PAUSE TestDefaultConfig/#111
=== RUN   TestDefaultConfig/#112
=== PAUSE TestDefaultConfig/#112
=== RUN   TestDefaultConfig/#113
=== PAUSE TestDefaultConfig/#113
=== RUN   TestDefaultConfig/#114
=== PAUSE TestDefaultConfig/#114
=== RUN   TestDefaultConfig/#115
=== PAUSE TestDefaultConfig/#115
=== RUN   TestDefaultConfig/#116
=== PAUSE TestDefaultConfig/#116
=== RUN   TestDefaultConfig/#117
=== PAUSE TestDefaultConfig/#117
=== RUN   TestDefaultConfig/#118
=== PAUSE TestDefaultConfig/#118
=== RUN   TestDefaultConfig/#119
=== PAUSE TestDefaultConfig/#119
=== RUN   TestDefaultConfig/#120
=== PAUSE TestDefaultConfig/#120
=== RUN   TestDefaultConfig/#121
=== PAUSE TestDefaultConfig/#121
=== RUN   TestDefaultConfig/#122
=== PAUSE TestDefaultConfig/#122
=== RUN   TestDefaultConfig/#123
=== PAUSE TestDefaultConfig/#123
=== RUN   TestDefaultConfig/#124
=== PAUSE TestDefaultConfig/#124
=== RUN   TestDefaultConfig/#125
=== PAUSE TestDefaultConfig/#125
=== RUN   TestDefaultConfig/#126
=== PAUSE TestDefaultConfig/#126
=== RUN   TestDefaultConfig/#127
=== PAUSE TestDefaultConfig/#127
=== RUN   TestDefaultConfig/#128
=== PAUSE TestDefaultConfig/#128
=== RUN   TestDefaultConfig/#129
=== PAUSE TestDefaultConfig/#129
=== RUN   TestDefaultConfig/#130
=== PAUSE TestDefaultConfig/#130
=== RUN   TestDefaultConfig/#131
=== PAUSE TestDefaultConfig/#131
=== RUN   TestDefaultConfig/#132
=== PAUSE TestDefaultConfig/#132
=== RUN   TestDefaultConfig/#133
=== PAUSE TestDefaultConfig/#133
=== RUN   TestDefaultConfig/#134
=== PAUSE TestDefaultConfig/#134
=== RUN   TestDefaultConfig/#135
=== PAUSE TestDefaultConfig/#135
=== RUN   TestDefaultConfig/#136
=== PAUSE TestDefaultConfig/#136
=== RUN   TestDefaultConfig/#137
=== PAUSE TestDefaultConfig/#137
=== RUN   TestDefaultConfig/#138
=== PAUSE TestDefaultConfig/#138
=== RUN   TestDefaultConfig/#139
=== PAUSE TestDefaultConfig/#139
=== RUN   TestDefaultConfig/#140
=== PAUSE TestDefaultConfig/#140
=== RUN   TestDefaultConfig/#141
=== PAUSE TestDefaultConfig/#141
=== RUN   TestDefaultConfig/#142
=== PAUSE TestDefaultConfig/#142
=== RUN   TestDefaultConfig/#143
=== PAUSE TestDefaultConfig/#143
=== RUN   TestDefaultConfig/#144
=== PAUSE TestDefaultConfig/#144
=== RUN   TestDefaultConfig/#145
=== PAUSE TestDefaultConfig/#145
=== RUN   TestDefaultConfig/#146
=== PAUSE TestDefaultConfig/#146
=== RUN   TestDefaultConfig/#147
=== PAUSE TestDefaultConfig/#147
=== RUN   TestDefaultConfig/#148
=== PAUSE TestDefaultConfig/#148
=== RUN   TestDefaultConfig/#149
=== PAUSE TestDefaultConfig/#149
=== RUN   TestDefaultConfig/#150
=== PAUSE TestDefaultConfig/#150
=== RUN   TestDefaultConfig/#151
=== PAUSE TestDefaultConfig/#151
=== RUN   TestDefaultConfig/#152
=== PAUSE TestDefaultConfig/#152
=== RUN   TestDefaultConfig/#153
=== PAUSE TestDefaultConfig/#153
jones - 2019/12/30 18:53:01.075504 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
=== RUN   TestDefaultConfig/#154
jones - 2019/12/30 18:53:01.075562 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:53:01.075628 [DEBUG] agent: Node info in sync
=== PAUSE TestDefaultConfig/#154
=== RUN   TestDefaultConfig/#155
=== PAUSE TestDefaultConfig/#155
=== RUN   TestDefaultConfig/#156
=== PAUSE TestDefaultConfig/#156
=== RUN   TestDefaultConfig/#157
=== PAUSE TestDefaultConfig/#157
=== RUN   TestDefaultConfig/#158
=== PAUSE TestDefaultConfig/#158
=== RUN   TestDefaultConfig/#159
=== PAUSE TestDefaultConfig/#159
=== RUN   TestDefaultConfig/#160
=== PAUSE TestDefaultConfig/#160
=== RUN   TestDefaultConfig/#161
=== PAUSE TestDefaultConfig/#161
=== RUN   TestDefaultConfig/#162
=== PAUSE TestDefaultConfig/#162
=== RUN   TestDefaultConfig/#163
=== PAUSE TestDefaultConfig/#163
=== RUN   TestDefaultConfig/#164
=== PAUSE TestDefaultConfig/#164
=== RUN   TestDefaultConfig/#165
=== PAUSE TestDefaultConfig/#165
=== RUN   TestDefaultConfig/#166
=== PAUSE TestDefaultConfig/#166
=== RUN   TestDefaultConfig/#167
=== PAUSE TestDefaultConfig/#167
=== RUN   TestDefaultConfig/#168
=== PAUSE TestDefaultConfig/#168
=== RUN   TestDefaultConfig/#169
=== PAUSE TestDefaultConfig/#169
=== RUN   TestDefaultConfig/#170
=== PAUSE TestDefaultConfig/#170
=== RUN   TestDefaultConfig/#171
=== PAUSE TestDefaultConfig/#171
=== RUN   TestDefaultConfig/#172
=== PAUSE TestDefaultConfig/#172
=== RUN   TestDefaultConfig/#173
=== PAUSE TestDefaultConfig/#173
=== RUN   TestDefaultConfig/#174
=== PAUSE TestDefaultConfig/#174
=== RUN   TestDefaultConfig/#175
=== PAUSE TestDefaultConfig/#175
=== RUN   TestDefaultConfig/#176
=== PAUSE TestDefaultConfig/#176
=== RUN   TestDefaultConfig/#177
=== PAUSE TestDefaultConfig/#177
=== RUN   TestDefaultConfig/#178
=== PAUSE TestDefaultConfig/#178
=== RUN   TestDefaultConfig/#179
=== PAUSE TestDefaultConfig/#179
=== RUN   TestDefaultConfig/#180
=== PAUSE TestDefaultConfig/#180
=== RUN   TestDefaultConfig/#181
=== PAUSE TestDefaultConfig/#181
=== RUN   TestDefaultConfig/#182
=== PAUSE TestDefaultConfig/#182
=== RUN   TestDefaultConfig/#183
=== PAUSE TestDefaultConfig/#183
=== RUN   TestDefaultConfig/#184
=== PAUSE TestDefaultConfig/#184
=== RUN   TestDefaultConfig/#185
=== PAUSE TestDefaultConfig/#185
=== RUN   TestDefaultConfig/#186
=== PAUSE TestDefaultConfig/#186
=== RUN   TestDefaultConfig/#187
=== PAUSE TestDefaultConfig/#187
=== RUN   TestDefaultConfig/#188
=== PAUSE TestDefaultConfig/#188
=== RUN   TestDefaultConfig/#189
=== PAUSE TestDefaultConfig/#189
=== RUN   TestDefaultConfig/#190
=== PAUSE TestDefaultConfig/#190
=== RUN   TestDefaultConfig/#191
=== PAUSE TestDefaultConfig/#191
=== RUN   TestDefaultConfig/#192
=== PAUSE TestDefaultConfig/#192
=== RUN   TestDefaultConfig/#193
=== PAUSE TestDefaultConfig/#193
=== RUN   TestDefaultConfig/#194
=== PAUSE TestDefaultConfig/#194
=== RUN   TestDefaultConfig/#195
=== PAUSE TestDefaultConfig/#195
=== RUN   TestDefaultConfig/#196
=== PAUSE TestDefaultConfig/#196
=== RUN   TestDefaultConfig/#197
=== PAUSE TestDefaultConfig/#197
=== RUN   TestDefaultConfig/#198
=== PAUSE TestDefaultConfig/#198
=== RUN   TestDefaultConfig/#199
=== PAUSE TestDefaultConfig/#199
=== RUN   TestDefaultConfig/#200
=== PAUSE TestDefaultConfig/#200
=== RUN   TestDefaultConfig/#201
=== PAUSE TestDefaultConfig/#201
=== RUN   TestDefaultConfig/#202
=== PAUSE TestDefaultConfig/#202
=== RUN   TestDefaultConfig/#203
=== PAUSE TestDefaultConfig/#203
=== RUN   TestDefaultConfig/#204
=== PAUSE TestDefaultConfig/#204
=== RUN   TestDefaultConfig/#205
=== PAUSE TestDefaultConfig/#205
=== RUN   TestDefaultConfig/#206
=== PAUSE TestDefaultConfig/#206
=== RUN   TestDefaultConfig/#207
=== PAUSE TestDefaultConfig/#207
=== RUN   TestDefaultConfig/#208
=== PAUSE TestDefaultConfig/#208
=== RUN   TestDefaultConfig/#209
=== PAUSE TestDefaultConfig/#209
=== RUN   TestDefaultConfig/#210
=== PAUSE TestDefaultConfig/#210
=== RUN   TestDefaultConfig/#211
=== PAUSE TestDefaultConfig/#211
=== RUN   TestDefaultConfig/#212
=== PAUSE TestDefaultConfig/#212
=== RUN   TestDefaultConfig/#213
=== PAUSE TestDefaultConfig/#213
=== RUN   TestDefaultConfig/#214
=== PAUSE TestDefaultConfig/#214
=== RUN   TestDefaultConfig/#215
=== PAUSE TestDefaultConfig/#215
=== RUN   TestDefaultConfig/#216
=== PAUSE TestDefaultConfig/#216
=== RUN   TestDefaultConfig/#217
=== PAUSE TestDefaultConfig/#217
=== RUN   TestDefaultConfig/#218
=== PAUSE TestDefaultConfig/#218
=== RUN   TestDefaultConfig/#219
=== PAUSE TestDefaultConfig/#219
=== RUN   TestDefaultConfig/#220
=== PAUSE TestDefaultConfig/#220
=== RUN   TestDefaultConfig/#221
=== PAUSE TestDefaultConfig/#221
=== RUN   TestDefaultConfig/#222
=== PAUSE TestDefaultConfig/#222
=== RUN   TestDefaultConfig/#223
=== PAUSE TestDefaultConfig/#223
=== RUN   TestDefaultConfig/#224
=== PAUSE TestDefaultConfig/#224
=== RUN   TestDefaultConfig/#225
=== PAUSE TestDefaultConfig/#225
=== RUN   TestDefaultConfig/#226
=== PAUSE TestDefaultConfig/#226
=== RUN   TestDefaultConfig/#227
=== PAUSE TestDefaultConfig/#227
=== RUN   TestDefaultConfig/#228
=== PAUSE TestDefaultConfig/#228
=== RUN   TestDefaultConfig/#229
=== PAUSE TestDefaultConfig/#229
=== RUN   TestDefaultConfig/#230
=== PAUSE TestDefaultConfig/#230
=== RUN   TestDefaultConfig/#231
=== PAUSE TestDefaultConfig/#231
=== RUN   TestDefaultConfig/#232
=== PAUSE TestDefaultConfig/#232
=== RUN   TestDefaultConfig/#233
=== PAUSE TestDefaultConfig/#233
=== RUN   TestDefaultConfig/#234
=== PAUSE TestDefaultConfig/#234
=== RUN   TestDefaultConfig/#235
=== PAUSE TestDefaultConfig/#235
=== RUN   TestDefaultConfig/#236
=== PAUSE TestDefaultConfig/#236
=== RUN   TestDefaultConfig/#237
=== PAUSE TestDefaultConfig/#237
=== RUN   TestDefaultConfig/#238
=== PAUSE TestDefaultConfig/#238
=== RUN   TestDefaultConfig/#239
=== PAUSE TestDefaultConfig/#239
=== RUN   TestDefaultConfig/#240
=== PAUSE TestDefaultConfig/#240
=== RUN   TestDefaultConfig/#241
=== PAUSE TestDefaultConfig/#241
=== RUN   TestDefaultConfig/#242
=== PAUSE TestDefaultConfig/#242
=== RUN   TestDefaultConfig/#243
=== PAUSE TestDefaultConfig/#243
=== RUN   TestDefaultConfig/#244
=== PAUSE TestDefaultConfig/#244
=== RUN   TestDefaultConfig/#245
=== PAUSE TestDefaultConfig/#245
=== RUN   TestDefaultConfig/#246
=== PAUSE TestDefaultConfig/#246
=== RUN   TestDefaultConfig/#247
=== PAUSE TestDefaultConfig/#247
=== RUN   TestDefaultConfig/#248
=== PAUSE TestDefaultConfig/#248
=== RUN   TestDefaultConfig/#249
=== PAUSE TestDefaultConfig/#249
=== RUN   TestDefaultConfig/#250
=== PAUSE TestDefaultConfig/#250
=== RUN   TestDefaultConfig/#251
=== PAUSE TestDefaultConfig/#251
=== RUN   TestDefaultConfig/#252
=== PAUSE TestDefaultConfig/#252
=== RUN   TestDefaultConfig/#253
=== PAUSE TestDefaultConfig/#253
=== RUN   TestDefaultConfig/#254
=== PAUSE TestDefaultConfig/#254
=== RUN   TestDefaultConfig/#255
=== PAUSE TestDefaultConfig/#255
=== RUN   TestDefaultConfig/#256
=== PAUSE TestDefaultConfig/#256
=== RUN   TestDefaultConfig/#257
=== PAUSE TestDefaultConfig/#257
=== RUN   TestDefaultConfig/#258
=== PAUSE TestDefaultConfig/#258
=== RUN   TestDefaultConfig/#259
=== PAUSE TestDefaultConfig/#259
=== RUN   TestDefaultConfig/#260
=== PAUSE TestDefaultConfig/#260
=== RUN   TestDefaultConfig/#261
=== PAUSE TestDefaultConfig/#261
=== RUN   TestDefaultConfig/#262
=== PAUSE TestDefaultConfig/#262
=== RUN   TestDefaultConfig/#263
=== PAUSE TestDefaultConfig/#263
=== RUN   TestDefaultConfig/#264
=== PAUSE TestDefaultConfig/#264
=== RUN   TestDefaultConfig/#265
=== PAUSE TestDefaultConfig/#265
=== RUN   TestDefaultConfig/#266
=== PAUSE TestDefaultConfig/#266
=== RUN   TestDefaultConfig/#267
=== PAUSE TestDefaultConfig/#267
=== RUN   TestDefaultConfig/#268
=== PAUSE TestDefaultConfig/#268
=== RUN   TestDefaultConfig/#269
=== PAUSE TestDefaultConfig/#269
=== RUN   TestDefaultConfig/#270
=== PAUSE TestDefaultConfig/#270
=== RUN   TestDefaultConfig/#271
=== PAUSE TestDefaultConfig/#271
=== RUN   TestDefaultConfig/#272
=== PAUSE TestDefaultConfig/#272
=== RUN   TestDefaultConfig/#273
=== PAUSE TestDefaultConfig/#273
=== RUN   TestDefaultConfig/#274
=== PAUSE TestDefaultConfig/#274
=== RUN   TestDefaultConfig/#275
=== PAUSE TestDefaultConfig/#275
=== RUN   TestDefaultConfig/#276
=== PAUSE TestDefaultConfig/#276
=== RUN   TestDefaultConfig/#277
=== PAUSE TestDefaultConfig/#277
=== RUN   TestDefaultConfig/#278
=== PAUSE TestDefaultConfig/#278
=== RUN   TestDefaultConfig/#279
=== PAUSE TestDefaultConfig/#279
=== RUN   TestDefaultConfig/#280
=== PAUSE TestDefaultConfig/#280
=== RUN   TestDefaultConfig/#281
=== PAUSE TestDefaultConfig/#281
=== RUN   TestDefaultConfig/#282
=== PAUSE TestDefaultConfig/#282
=== RUN   TestDefaultConfig/#283
=== PAUSE TestDefaultConfig/#283
=== RUN   TestDefaultConfig/#284
=== PAUSE TestDefaultConfig/#284
=== RUN   TestDefaultConfig/#285
=== PAUSE TestDefaultConfig/#285
=== RUN   TestDefaultConfig/#286
=== PAUSE TestDefaultConfig/#286
=== RUN   TestDefaultConfig/#287
=== PAUSE TestDefaultConfig/#287
=== RUN   TestDefaultConfig/#288
=== PAUSE TestDefaultConfig/#288
=== RUN   TestDefaultConfig/#289
=== PAUSE TestDefaultConfig/#289
=== RUN   TestDefaultConfig/#290
=== PAUSE TestDefaultConfig/#290
=== RUN   TestDefaultConfig/#291
=== PAUSE TestDefaultConfig/#291
=== RUN   TestDefaultConfig/#292
=== PAUSE TestDefaultConfig/#292
=== RUN   TestDefaultConfig/#293
=== PAUSE TestDefaultConfig/#293
=== RUN   TestDefaultConfig/#294
=== PAUSE TestDefaultConfig/#294
=== RUN   TestDefaultConfig/#295
=== PAUSE TestDefaultConfig/#295
=== RUN   TestDefaultConfig/#296
=== PAUSE TestDefaultConfig/#296
=== RUN   TestDefaultConfig/#297
=== PAUSE TestDefaultConfig/#297
=== RUN   TestDefaultConfig/#298
=== PAUSE TestDefaultConfig/#298
=== RUN   TestDefaultConfig/#299
=== PAUSE TestDefaultConfig/#299
=== RUN   TestDefaultConfig/#300
=== PAUSE TestDefaultConfig/#300
=== RUN   TestDefaultConfig/#301
=== PAUSE TestDefaultConfig/#301
=== RUN   TestDefaultConfig/#302
=== PAUSE TestDefaultConfig/#302
=== RUN   TestDefaultConfig/#303
=== PAUSE TestDefaultConfig/#303
=== RUN   TestDefaultConfig/#304
=== PAUSE TestDefaultConfig/#304
=== RUN   TestDefaultConfig/#305
=== PAUSE TestDefaultConfig/#305
=== RUN   TestDefaultConfig/#306
=== PAUSE TestDefaultConfig/#306
=== RUN   TestDefaultConfig/#307
=== PAUSE TestDefaultConfig/#307
=== RUN   TestDefaultConfig/#308
=== PAUSE TestDefaultConfig/#308
=== RUN   TestDefaultConfig/#309
=== PAUSE TestDefaultConfig/#309
=== RUN   TestDefaultConfig/#310
=== PAUSE TestDefaultConfig/#310
=== RUN   TestDefaultConfig/#311
=== PAUSE TestDefaultConfig/#311
=== RUN   TestDefaultConfig/#312
=== PAUSE TestDefaultConfig/#312
=== RUN   TestDefaultConfig/#313
=== PAUSE TestDefaultConfig/#313
=== RUN   TestDefaultConfig/#314
=== PAUSE TestDefaultConfig/#314
=== RUN   TestDefaultConfig/#315
=== PAUSE TestDefaultConfig/#315
=== RUN   TestDefaultConfig/#316
=== PAUSE TestDefaultConfig/#316
=== RUN   TestDefaultConfig/#317
=== PAUSE TestDefaultConfig/#317
=== RUN   TestDefaultConfig/#318
=== PAUSE TestDefaultConfig/#318
=== RUN   TestDefaultConfig/#319
=== PAUSE TestDefaultConfig/#319
=== RUN   TestDefaultConfig/#320
=== PAUSE TestDefaultConfig/#320
=== RUN   TestDefaultConfig/#321
=== PAUSE TestDefaultConfig/#321
=== RUN   TestDefaultConfig/#322
=== PAUSE TestDefaultConfig/#322
=== RUN   TestDefaultConfig/#323
=== PAUSE TestDefaultConfig/#323
=== RUN   TestDefaultConfig/#324
=== PAUSE TestDefaultConfig/#324
=== RUN   TestDefaultConfig/#325
=== PAUSE TestDefaultConfig/#325
=== RUN   TestDefaultConfig/#326
=== PAUSE TestDefaultConfig/#326
=== RUN   TestDefaultConfig/#327
=== PAUSE TestDefaultConfig/#327
=== RUN   TestDefaultConfig/#328
=== PAUSE TestDefaultConfig/#328
=== RUN   TestDefaultConfig/#329
=== PAUSE TestDefaultConfig/#329
=== RUN   TestDefaultConfig/#330
=== PAUSE TestDefaultConfig/#330
=== RUN   TestDefaultConfig/#331
=== PAUSE TestDefaultConfig/#331
=== RUN   TestDefaultConfig/#332
=== PAUSE TestDefaultConfig/#332
=== RUN   TestDefaultConfig/#333
=== PAUSE TestDefaultConfig/#333
=== RUN   TestDefaultConfig/#334
=== PAUSE TestDefaultConfig/#334
=== RUN   TestDefaultConfig/#335
=== PAUSE TestDefaultConfig/#335
=== RUN   TestDefaultConfig/#336
=== PAUSE TestDefaultConfig/#336
=== RUN   TestDefaultConfig/#337
=== PAUSE TestDefaultConfig/#337
=== RUN   TestDefaultConfig/#338
=== PAUSE TestDefaultConfig/#338
=== RUN   TestDefaultConfig/#339
=== PAUSE TestDefaultConfig/#339
=== RUN   TestDefaultConfig/#340
=== PAUSE TestDefaultConfig/#340
=== RUN   TestDefaultConfig/#341
=== PAUSE TestDefaultConfig/#341
=== RUN   TestDefaultConfig/#342
=== PAUSE TestDefaultConfig/#342
=== RUN   TestDefaultConfig/#343
=== PAUSE TestDefaultConfig/#343
=== RUN   TestDefaultConfig/#344
=== PAUSE TestDefaultConfig/#344
=== RUN   TestDefaultConfig/#345
=== PAUSE TestDefaultConfig/#345
=== RUN   TestDefaultConfig/#346
=== PAUSE TestDefaultConfig/#346
=== RUN   TestDefaultConfig/#347
=== PAUSE TestDefaultConfig/#347
=== RUN   TestDefaultConfig/#348
=== PAUSE TestDefaultConfig/#348
=== RUN   TestDefaultConfig/#349
=== PAUSE TestDefaultConfig/#349
=== RUN   TestDefaultConfig/#350
=== PAUSE TestDefaultConfig/#350
=== RUN   TestDefaultConfig/#351
=== PAUSE TestDefaultConfig/#351
=== RUN   TestDefaultConfig/#352
=== PAUSE TestDefaultConfig/#352
=== RUN   TestDefaultConfig/#353
=== PAUSE TestDefaultConfig/#353
=== RUN   TestDefaultConfig/#354
=== PAUSE TestDefaultConfig/#354
=== RUN   TestDefaultConfig/#355
=== PAUSE TestDefaultConfig/#355
=== RUN   TestDefaultConfig/#356
=== PAUSE TestDefaultConfig/#356
=== RUN   TestDefaultConfig/#357
=== PAUSE TestDefaultConfig/#357
=== RUN   TestDefaultConfig/#358
=== PAUSE TestDefaultConfig/#358
=== RUN   TestDefaultConfig/#359
=== PAUSE TestDefaultConfig/#359
=== RUN   TestDefaultConfig/#360
=== PAUSE TestDefaultConfig/#360
=== RUN   TestDefaultConfig/#361
=== PAUSE TestDefaultConfig/#361
=== RUN   TestDefaultConfig/#362
=== PAUSE TestDefaultConfig/#362
=== RUN   TestDefaultConfig/#363
=== PAUSE TestDefaultConfig/#363
=== RUN   TestDefaultConfig/#364
=== PAUSE TestDefaultConfig/#364
=== RUN   TestDefaultConfig/#365
=== PAUSE TestDefaultConfig/#365
=== RUN   TestDefaultConfig/#366
=== PAUSE TestDefaultConfig/#366
=== RUN   TestDefaultConfig/#367
=== PAUSE TestDefaultConfig/#367
=== RUN   TestDefaultConfig/#368
=== PAUSE TestDefaultConfig/#368
=== RUN   TestDefaultConfig/#369
=== PAUSE TestDefaultConfig/#369
=== RUN   TestDefaultConfig/#370
=== PAUSE TestDefaultConfig/#370
=== RUN   TestDefaultConfig/#371
=== PAUSE TestDefaultConfig/#371
=== RUN   TestDefaultConfig/#372
=== PAUSE TestDefaultConfig/#372
=== RUN   TestDefaultConfig/#373
=== PAUSE TestDefaultConfig/#373
=== RUN   TestDefaultConfig/#374
=== PAUSE TestDefaultConfig/#374
=== RUN   TestDefaultConfig/#375
=== PAUSE TestDefaultConfig/#375
=== RUN   TestDefaultConfig/#376
=== PAUSE TestDefaultConfig/#376
=== RUN   TestDefaultConfig/#377
=== PAUSE TestDefaultConfig/#377
=== RUN   TestDefaultConfig/#378
=== PAUSE TestDefaultConfig/#378
=== RUN   TestDefaultConfig/#379
=== PAUSE TestDefaultConfig/#379
=== RUN   TestDefaultConfig/#380
=== PAUSE TestDefaultConfig/#380
=== RUN   TestDefaultConfig/#381
=== PAUSE TestDefaultConfig/#381
=== RUN   TestDefaultConfig/#382
=== PAUSE TestDefaultConfig/#382
=== RUN   TestDefaultConfig/#383
=== PAUSE TestDefaultConfig/#383
=== RUN   TestDefaultConfig/#384
=== PAUSE TestDefaultConfig/#384
=== RUN   TestDefaultConfig/#385
=== PAUSE TestDefaultConfig/#385
=== RUN   TestDefaultConfig/#386
=== PAUSE TestDefaultConfig/#386
=== RUN   TestDefaultConfig/#387
=== PAUSE TestDefaultConfig/#387
=== RUN   TestDefaultConfig/#388
=== PAUSE TestDefaultConfig/#388
=== RUN   TestDefaultConfig/#389
=== PAUSE TestDefaultConfig/#389
=== RUN   TestDefaultConfig/#390
=== PAUSE TestDefaultConfig/#390
=== RUN   TestDefaultConfig/#391
=== PAUSE TestDefaultConfig/#391
=== RUN   TestDefaultConfig/#392
=== PAUSE TestDefaultConfig/#392
=== RUN   TestDefaultConfig/#393
=== PAUSE TestDefaultConfig/#393
=== RUN   TestDefaultConfig/#394
=== PAUSE TestDefaultConfig/#394
=== RUN   TestDefaultConfig/#395
=== PAUSE TestDefaultConfig/#395
=== RUN   TestDefaultConfig/#396
=== PAUSE TestDefaultConfig/#396
=== RUN   TestDefaultConfig/#397
=== PAUSE TestDefaultConfig/#397
=== RUN   TestDefaultConfig/#398
=== PAUSE TestDefaultConfig/#398
=== RUN   TestDefaultConfig/#399
=== PAUSE TestDefaultConfig/#399
=== RUN   TestDefaultConfig/#400
=== PAUSE TestDefaultConfig/#400
=== RUN   TestDefaultConfig/#401
=== PAUSE TestDefaultConfig/#401
=== RUN   TestDefaultConfig/#402
=== PAUSE TestDefaultConfig/#402
=== RUN   TestDefaultConfig/#403
=== PAUSE TestDefaultConfig/#403
=== RUN   TestDefaultConfig/#404
=== PAUSE TestDefaultConfig/#404
=== RUN   TestDefaultConfig/#405
=== PAUSE TestDefaultConfig/#405
=== RUN   TestDefaultConfig/#406
=== PAUSE TestDefaultConfig/#406
=== RUN   TestDefaultConfig/#407
=== PAUSE TestDefaultConfig/#407
=== RUN   TestDefaultConfig/#408
=== PAUSE TestDefaultConfig/#408
=== RUN   TestDefaultConfig/#409
=== PAUSE TestDefaultConfig/#409
=== RUN   TestDefaultConfig/#410
=== PAUSE TestDefaultConfig/#410
=== RUN   TestDefaultConfig/#411
=== PAUSE TestDefaultConfig/#411
=== RUN   TestDefaultConfig/#412
=== PAUSE TestDefaultConfig/#412
=== RUN   TestDefaultConfig/#413
=== PAUSE TestDefaultConfig/#413
=== RUN   TestDefaultConfig/#414
=== PAUSE TestDefaultConfig/#414
=== RUN   TestDefaultConfig/#415
=== PAUSE TestDefaultConfig/#415
=== RUN   TestDefaultConfig/#416
=== PAUSE TestDefaultConfig/#416
=== RUN   TestDefaultConfig/#417
=== PAUSE TestDefaultConfig/#417
=== RUN   TestDefaultConfig/#418
=== PAUSE TestDefaultConfig/#418
=== RUN   TestDefaultConfig/#419
=== PAUSE TestDefaultConfig/#419
=== RUN   TestDefaultConfig/#420
=== PAUSE TestDefaultConfig/#420
=== RUN   TestDefaultConfig/#421
=== PAUSE TestDefaultConfig/#421
=== RUN   TestDefaultConfig/#422
=== PAUSE TestDefaultConfig/#422
=== RUN   TestDefaultConfig/#423
=== PAUSE TestDefaultConfig/#423
=== RUN   TestDefaultConfig/#424
=== PAUSE TestDefaultConfig/#424
=== RUN   TestDefaultConfig/#425
=== PAUSE TestDefaultConfig/#425
=== RUN   TestDefaultConfig/#426
=== PAUSE TestDefaultConfig/#426
=== RUN   TestDefaultConfig/#427
=== PAUSE TestDefaultConfig/#427
=== RUN   TestDefaultConfig/#428
=== PAUSE TestDefaultConfig/#428
=== RUN   TestDefaultConfig/#429
=== PAUSE TestDefaultConfig/#429
=== RUN   TestDefaultConfig/#430
=== PAUSE TestDefaultConfig/#430
=== RUN   TestDefaultConfig/#431
=== PAUSE TestDefaultConfig/#431
=== RUN   TestDefaultConfig/#432
=== PAUSE TestDefaultConfig/#432
=== RUN   TestDefaultConfig/#433
=== PAUSE TestDefaultConfig/#433
=== RUN   TestDefaultConfig/#434
=== PAUSE TestDefaultConfig/#434
=== RUN   TestDefaultConfig/#435
=== PAUSE TestDefaultConfig/#435
=== RUN   TestDefaultConfig/#436
=== PAUSE TestDefaultConfig/#436
=== RUN   TestDefaultConfig/#437
=== PAUSE TestDefaultConfig/#437
=== RUN   TestDefaultConfig/#438
=== PAUSE TestDefaultConfig/#438
=== RUN   TestDefaultConfig/#439
=== PAUSE TestDefaultConfig/#439
=== RUN   TestDefaultConfig/#440
=== PAUSE TestDefaultConfig/#440
=== RUN   TestDefaultConfig/#441
=== PAUSE TestDefaultConfig/#441
=== RUN   TestDefaultConfig/#442
=== PAUSE TestDefaultConfig/#442
=== RUN   TestDefaultConfig/#443
=== PAUSE TestDefaultConfig/#443
=== RUN   TestDefaultConfig/#444
=== PAUSE TestDefaultConfig/#444
=== RUN   TestDefaultConfig/#445
=== PAUSE TestDefaultConfig/#445
=== RUN   TestDefaultConfig/#446
=== PAUSE TestDefaultConfig/#446
=== RUN   TestDefaultConfig/#447
=== PAUSE TestDefaultConfig/#447
=== RUN   TestDefaultConfig/#448
=== PAUSE TestDefaultConfig/#448
=== RUN   TestDefaultConfig/#449
=== PAUSE TestDefaultConfig/#449
=== RUN   TestDefaultConfig/#450
=== PAUSE TestDefaultConfig/#450
=== RUN   TestDefaultConfig/#451
=== PAUSE TestDefaultConfig/#451
=== RUN   TestDefaultConfig/#452
=== PAUSE TestDefaultConfig/#452
=== RUN   TestDefaultConfig/#453
=== PAUSE TestDefaultConfig/#453
=== RUN   TestDefaultConfig/#454
=== PAUSE TestDefaultConfig/#454
=== RUN   TestDefaultConfig/#455
=== PAUSE TestDefaultConfig/#455
=== RUN   TestDefaultConfig/#456
=== PAUSE TestDefaultConfig/#456
=== RUN   TestDefaultConfig/#457
=== PAUSE TestDefaultConfig/#457
=== RUN   TestDefaultConfig/#458
=== PAUSE TestDefaultConfig/#458
=== RUN   TestDefaultConfig/#459
=== PAUSE TestDefaultConfig/#459
=== RUN   TestDefaultConfig/#460
=== PAUSE TestDefaultConfig/#460
=== RUN   TestDefaultConfig/#461
=== PAUSE TestDefaultConfig/#461
=== RUN   TestDefaultConfig/#462
=== PAUSE TestDefaultConfig/#462
=== RUN   TestDefaultConfig/#463
=== PAUSE TestDefaultConfig/#463
=== RUN   TestDefaultConfig/#464
=== PAUSE TestDefaultConfig/#464
=== RUN   TestDefaultConfig/#465
=== PAUSE TestDefaultConfig/#465
=== RUN   TestDefaultConfig/#466
=== PAUSE TestDefaultConfig/#466
=== RUN   TestDefaultConfig/#467
=== PAUSE TestDefaultConfig/#467
=== RUN   TestDefaultConfig/#468
=== PAUSE TestDefaultConfig/#468
=== RUN   TestDefaultConfig/#469
=== PAUSE TestDefaultConfig/#469
=== RUN   TestDefaultConfig/#470
=== PAUSE TestDefaultConfig/#470
=== RUN   TestDefaultConfig/#471
=== PAUSE TestDefaultConfig/#471
=== RUN   TestDefaultConfig/#472
=== PAUSE TestDefaultConfig/#472
=== RUN   TestDefaultConfig/#473
=== PAUSE TestDefaultConfig/#473
=== RUN   TestDefaultConfig/#474
=== PAUSE TestDefaultConfig/#474
=== RUN   TestDefaultConfig/#475
=== PAUSE TestDefaultConfig/#475
=== RUN   TestDefaultConfig/#476
=== PAUSE TestDefaultConfig/#476
=== RUN   TestDefaultConfig/#477
=== PAUSE TestDefaultConfig/#477
=== RUN   TestDefaultConfig/#478
=== PAUSE TestDefaultConfig/#478
=== RUN   TestDefaultConfig/#479
=== PAUSE TestDefaultConfig/#479
=== RUN   TestDefaultConfig/#480
=== PAUSE TestDefaultConfig/#480
=== RUN   TestDefaultConfig/#481
=== PAUSE TestDefaultConfig/#481
=== RUN   TestDefaultConfig/#482
=== PAUSE TestDefaultConfig/#482
=== RUN   TestDefaultConfig/#483
=== PAUSE TestDefaultConfig/#483
=== RUN   TestDefaultConfig/#484
=== PAUSE TestDefaultConfig/#484
=== RUN   TestDefaultConfig/#485
=== PAUSE TestDefaultConfig/#485
=== RUN   TestDefaultConfig/#486
=== PAUSE TestDefaultConfig/#486
=== RUN   TestDefaultConfig/#487
=== PAUSE TestDefaultConfig/#487
=== RUN   TestDefaultConfig/#488
=== PAUSE TestDefaultConfig/#488
=== RUN   TestDefaultConfig/#489
=== PAUSE TestDefaultConfig/#489
=== RUN   TestDefaultConfig/#490
=== PAUSE TestDefaultConfig/#490
=== RUN   TestDefaultConfig/#491
=== PAUSE TestDefaultConfig/#491
=== RUN   TestDefaultConfig/#492
=== PAUSE TestDefaultConfig/#492
=== RUN   TestDefaultConfig/#493
=== PAUSE TestDefaultConfig/#493
=== RUN   TestDefaultConfig/#494
=== PAUSE TestDefaultConfig/#494
=== RUN   TestDefaultConfig/#495
=== PAUSE TestDefaultConfig/#495
=== RUN   TestDefaultConfig/#496
=== PAUSE TestDefaultConfig/#496
=== RUN   TestDefaultConfig/#497
=== PAUSE TestDefaultConfig/#497
=== RUN   TestDefaultConfig/#498
=== PAUSE TestDefaultConfig/#498
=== RUN   TestDefaultConfig/#499
=== PAUSE TestDefaultConfig/#499
=== CONT  TestDefaultConfig/#488
=== CONT  TestDefaultConfig/#487
=== CONT  TestDefaultConfig/#00
=== CONT  TestDefaultConfig/#452
=== CONT  TestDefaultConfig/#451
=== CONT  TestDefaultConfig/#450
=== CONT  TestDefaultConfig/#449
=== CONT  TestDefaultConfig/#448
=== CONT  TestDefaultConfig/#447
=== CONT  TestDefaultConfig/#446
=== CONT  TestDefaultConfig/#445
=== CONT  TestDefaultConfig/#444
=== CONT  TestDefaultConfig/#443
=== CONT  TestDefaultConfig/#442
=== CONT  TestDefaultConfig/#441
=== CONT  TestDefaultConfig/#440
=== CONT  TestDefaultConfig/#439
=== CONT  TestDefaultConfig/#438
=== CONT  TestDefaultConfig/#437
=== CONT  TestDefaultConfig/#436
jones - 2019/12/30 18:53:02.270954 [INFO] agent: Synced service "web1-sidecar-proxy"
jones - 2019/12/30 18:53:02.271028 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:53:02.271114 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/30 18:53:02.271147 [DEBUG] agent: Node info in sync
=== CONT  TestDefaultConfig/#435
=== CONT  TestDefaultConfig/#434
=== CONT  TestDefaultConfig/#433
=== CONT  TestDefaultConfig/#432
=== CONT  TestDefaultConfig/#431
=== CONT  TestDefaultConfig/#430
=== CONT  TestDefaultConfig/#429
=== CONT  TestDefaultConfig/#428
=== CONT  TestDefaultConfig/#427
=== CONT  TestDefaultConfig/#386
=== CONT  TestDefaultConfig/#426
=== CONT  TestDefaultConfig/#425
=== CONT  TestDefaultConfig/#424
=== CONT  TestDefaultConfig/#423
=== CONT  TestDefaultConfig/#422
=== CONT  TestDefaultConfig/#421
=== CONT  TestDefaultConfig/#420
jones - 2019/12/30 18:53:03.281498 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/30 18:53:03.282153 [DEBUG] consul: Skipping self join check for "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade" since the cluster is too small
jones - 2019/12/30 18:53:03.282402 [INFO] consul: member 'Node f632792c-c81a-fbfb-b7c4-e99bdb454ade' joined, marking health alive
jones - 2019/12/30 18:53:03.297388 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
=== CONT  TestDefaultConfig/#419
=== CONT  TestDefaultConfig/#418
=== CONT  TestDefaultConfig/#417
=== CONT  TestDefaultConfig/#416
=== CONT  TestDefaultConfig/#415
=== CONT  TestDefaultConfig/#414
=== CONT  TestDefaultConfig/#413
=== CONT  TestDefaultConfig/#412
=== CONT  TestDefaultConfig/#411
=== CONT  TestDefaultConfig/#410
=== CONT  TestDefaultConfig/#409
=== CONT  TestDefaultConfig/#408
=== CONT  TestDefaultConfig/#407
=== CONT  TestDefaultConfig/#406
=== CONT  TestDefaultConfig/#405
=== CONT  TestDefaultConfig/#404
=== CONT  TestDefaultConfig/#403
=== CONT  TestDefaultConfig/#402
=== CONT  TestDefaultConfig/#401
=== CONT  TestDefaultConfig/#400
=== CONT  TestDefaultConfig/#399
=== CONT  TestDefaultConfig/#398
=== CONT  TestDefaultConfig/#397
=== CONT  TestDefaultConfig/#396
=== CONT  TestDefaultConfig/#395
=== CONT  TestDefaultConfig/#394
=== CONT  TestDefaultConfig/#393
=== CONT  TestDefaultConfig/#392
=== CONT  TestDefaultConfig/#391
=== CONT  TestDefaultConfig/#390
=== CONT  TestDefaultConfig/#389
=== CONT  TestDefaultConfig/#388
=== CONT  TestDefaultConfig/#387
=== CONT  TestDefaultConfig/#385
=== CONT  TestDefaultConfig/#384
=== CONT  TestDefaultConfig/#383
=== CONT  TestDefaultConfig/#382
=== CONT  TestDefaultConfig/#381
=== CONT  TestDefaultConfig/#380
=== CONT  TestDefaultConfig/#379
=== CONT  TestDefaultConfig/#378
=== CONT  TestDefaultConfig/#377
=== CONT  TestDefaultConfig/#376
=== CONT  TestDefaultConfig/#375
=== CONT  TestDefaultConfig/#374
=== CONT  TestDefaultConfig/#373
=== CONT  TestDefaultConfig/#372
=== CONT  TestDefaultConfig/#371
=== CONT  TestDefaultConfig/#370
=== CONT  TestDefaultConfig/#369
=== CONT  TestDefaultConfig/#368
=== CONT  TestDefaultConfig/#367
=== CONT  TestDefaultConfig/#366
=== CONT  TestDefaultConfig/#336
=== CONT  TestDefaultConfig/#365
=== CONT  TestDefaultConfig/#364
=== CONT  TestDefaultConfig/#363
=== CONT  TestDefaultConfig/#362
=== CONT  TestDefaultConfig/#361
=== CONT  TestDefaultConfig/#360
=== CONT  TestDefaultConfig/#359
=== CONT  TestDefaultConfig/#358
=== CONT  TestDefaultConfig/#357
=== CONT  TestDefaultConfig/#356
=== CONT  TestDefaultConfig/#355
=== CONT  TestDefaultConfig/#354
=== CONT  TestDefaultConfig/#353
=== CONT  TestDefaultConfig/#352
=== CONT  TestDefaultConfig/#351
=== CONT  TestDefaultConfig/#350
=== CONT  TestDefaultConfig/#349
=== CONT  TestDefaultConfig/#348
=== CONT  TestDefaultConfig/#347
=== CONT  TestDefaultConfig/#346
=== CONT  TestDefaultConfig/#345
=== CONT  TestDefaultConfig/#344
=== CONT  TestDefaultConfig/#343
=== CONT  TestDefaultConfig/#342
=== CONT  TestDefaultConfig/#341
=== CONT  TestDefaultConfig/#340
=== CONT  TestDefaultConfig/#339
=== CONT  TestDefaultConfig/#338
=== CONT  TestDefaultConfig/#337
=== CONT  TestDefaultConfig/#335
=== CONT  TestDefaultConfig/#334
=== CONT  TestDefaultConfig/#333
=== CONT  TestDefaultConfig/#332
=== CONT  TestDefaultConfig/#331
=== CONT  TestDefaultConfig/#330
=== CONT  TestDefaultConfig/#329
=== CONT  TestDefaultConfig/#328
=== CONT  TestDefaultConfig/#327
=== CONT  TestDefaultConfig/#326
=== CONT  TestDefaultConfig/#325
=== CONT  TestDefaultConfig/#324
=== CONT  TestDefaultConfig/#195
=== CONT  TestDefaultConfig/#323
=== CONT  TestDefaultConfig/#322
=== CONT  TestDefaultConfig/#321
=== CONT  TestDefaultConfig/#320
=== CONT  TestDefaultConfig/#319
=== CONT  TestDefaultConfig/#318
=== CONT  TestDefaultConfig/#317
=== CONT  TestDefaultConfig/#316
=== CONT  TestDefaultConfig/#315
=== CONT  TestDefaultConfig/#314
=== CONT  TestDefaultConfig/#313
=== CONT  TestDefaultConfig/#312
=== CONT  TestDefaultConfig/#311
=== CONT  TestDefaultConfig/#310
=== CONT  TestDefaultConfig/#309
=== CONT  TestDefaultConfig/#308
=== CONT  TestDefaultConfig/#307
=== CONT  TestDefaultConfig/#301
=== CONT  TestDefaultConfig/#306
=== CONT  TestDefaultConfig/#305
=== CONT  TestDefaultConfig/#304
=== CONT  TestDefaultConfig/#303
=== CONT  TestDefaultConfig/#302
=== CONT  TestDefaultConfig/#300
=== CONT  TestDefaultConfig/#299
=== CONT  TestDefaultConfig/#298
=== CONT  TestDefaultConfig/#297
=== CONT  TestDefaultConfig/#296
=== CONT  TestDefaultConfig/#295
=== CONT  TestDefaultConfig/#294
=== CONT  TestDefaultConfig/#293
=== CONT  TestDefaultConfig/#292
=== CONT  TestDefaultConfig/#291
=== CONT  TestDefaultConfig/#290
=== CONT  TestDefaultConfig/#289
=== CONT  TestDefaultConfig/#288
=== CONT  TestDefaultConfig/#287
=== CONT  TestDefaultConfig/#286
=== CONT  TestDefaultConfig/#285
=== CONT  TestDefaultConfig/#284
=== CONT  TestDefaultConfig/#283
=== CONT  TestDefaultConfig/#282
=== CONT  TestDefaultConfig/#281
=== CONT  TestDefaultConfig/#280
=== CONT  TestDefaultConfig/#279
=== CONT  TestDefaultConfig/#278
=== CONT  TestDefaultConfig/#277
=== CONT  TestDefaultConfig/#276
=== CONT  TestDefaultConfig/#275
=== CONT  TestDefaultConfig/#274
=== CONT  TestDefaultConfig/#273
=== CONT  TestDefaultConfig/#272
=== CONT  TestDefaultConfig/#271
=== CONT  TestDefaultConfig/#270
=== CONT  TestDefaultConfig/#269
=== CONT  TestDefaultConfig/#268
=== CONT  TestDefaultConfig/#267
=== CONT  TestDefaultConfig/#266
=== CONT  TestDefaultConfig/#265
=== CONT  TestDefaultConfig/#264
=== CONT  TestDefaultConfig/#263
=== CONT  TestDefaultConfig/#262
=== CONT  TestDefaultConfig/#261
=== CONT  TestDefaultConfig/#260
=== CONT  TestDefaultConfig/#259
=== CONT  TestDefaultConfig/#258
=== CONT  TestDefaultConfig/#257
=== CONT  TestDefaultConfig/#256
=== CONT  TestDefaultConfig/#255
=== CONT  TestDefaultConfig/#254
=== CONT  TestDefaultConfig/#253
=== CONT  TestDefaultConfig/#252
=== CONT  TestDefaultConfig/#251
=== CONT  TestDefaultConfig/#250
=== CONT  TestDefaultConfig/#249
=== CONT  TestDefaultConfig/#248
=== CONT  TestDefaultConfig/#247
=== CONT  TestDefaultConfig/#246
=== CONT  TestDefaultConfig/#238
=== CONT  TestDefaultConfig/#245
=== CONT  TestDefaultConfig/#244
=== CONT  TestDefaultConfig/#243
=== CONT  TestDefaultConfig/#242
=== CONT  TestDefaultConfig/#241
=== CONT  TestDefaultConfig/#240
=== CONT  TestDefaultConfig/#239
=== CONT  TestDefaultConfig/#237
=== CONT  TestDefaultConfig/#236
=== CONT  TestDefaultConfig/#235
=== CONT  TestDefaultConfig/#234
=== CONT  TestDefaultConfig/#233
=== CONT  TestDefaultConfig/#232
=== CONT  TestDefaultConfig/#231
=== CONT  TestDefaultConfig/#230
=== CONT  TestDefaultConfig/#229
=== CONT  TestDefaultConfig/#228
=== CONT  TestDefaultConfig/#227
=== CONT  TestDefaultConfig/#226
=== CONT  TestDefaultConfig/#224
=== CONT  TestDefaultConfig/#225
=== CONT  TestDefaultConfig/#223
=== CONT  TestDefaultConfig/#222
=== CONT  TestDefaultConfig/#221
=== CONT  TestDefaultConfig/#220
=== CONT  TestDefaultConfig/#219
=== CONT  TestDefaultConfig/#218
=== CONT  TestDefaultConfig/#217
=== CONT  TestDefaultConfig/#216
=== CONT  TestDefaultConfig/#215
=== CONT  TestDefaultConfig/#214
=== CONT  TestDefaultConfig/#213
=== CONT  TestDefaultConfig/#212
=== CONT  TestDefaultConfig/#211
=== CONT  TestDefaultConfig/#210
=== CONT  TestDefaultConfig/#209
=== CONT  TestDefaultConfig/#208
=== CONT  TestDefaultConfig/#207
=== CONT  TestDefaultConfig/#206
=== CONT  TestDefaultConfig/#204
=== CONT  TestDefaultConfig/#205
=== CONT  TestDefaultConfig/#203
=== CONT  TestDefaultConfig/#202
=== CONT  TestDefaultConfig/#201
=== CONT  TestDefaultConfig/#200
=== CONT  TestDefaultConfig/#199
=== CONT  TestDefaultConfig/#198
=== CONT  TestDefaultConfig/#197
=== CONT  TestDefaultConfig/#196
=== CONT  TestDefaultConfig/#66
=== CONT  TestDefaultConfig/#494
=== CONT  TestDefaultConfig/#499
=== CONT  TestDefaultConfig/#498
=== CONT  TestDefaultConfig/#497
=== CONT  TestDefaultConfig/#496
=== CONT  TestDefaultConfig/#495
=== CONT  TestDefaultConfig/#99
=== CONT  TestDefaultConfig/#194
=== CONT  TestDefaultConfig/#193
=== CONT  TestDefaultConfig/#192
=== CONT  TestDefaultConfig/#191
=== CONT  TestDefaultConfig/#190
=== CONT  TestDefaultConfig/#189
=== CONT  TestDefaultConfig/#188
=== CONT  TestDefaultConfig/#187
=== CONT  TestDefaultConfig/#186
=== CONT  TestDefaultConfig/#185
=== CONT  TestDefaultConfig/#184
=== CONT  TestDefaultConfig/#183
=== CONT  TestDefaultConfig/#182
=== CONT  TestDefaultConfig/#181
=== CONT  TestDefaultConfig/#73
=== CONT  TestDefaultConfig/#180
=== CONT  TestDefaultConfig/#179
=== CONT  TestDefaultConfig/#178
=== CONT  TestDefaultConfig/#177
=== CONT  TestDefaultConfig/#176
=== CONT  TestDefaultConfig/#175
=== CONT  TestDefaultConfig/#174
=== CONT  TestDefaultConfig/#173
=== CONT  TestDefaultConfig/#172
=== CONT  TestDefaultConfig/#171
=== CONT  TestDefaultConfig/#170
=== CONT  TestDefaultConfig/#169
=== CONT  TestDefaultConfig/#168
=== CONT  TestDefaultConfig/#167
=== CONT  TestDefaultConfig/#166
=== CONT  TestDefaultConfig/#165
=== CONT  TestDefaultConfig/#164
=== CONT  TestDefaultConfig/#163
=== CONT  TestDefaultConfig/#162
=== CONT  TestDefaultConfig/#115
=== CONT  TestDefaultConfig/#161
=== CONT  TestDefaultConfig/#160
=== CONT  TestDefaultConfig/#159
=== CONT  TestDefaultConfig/#158
=== CONT  TestDefaultConfig/#157
=== CONT  TestDefaultConfig/#156
=== CONT  TestDefaultConfig/#155
=== CONT  TestDefaultConfig/#154
=== CONT  TestDefaultConfig/#153
=== CONT  TestDefaultConfig/#152
=== CONT  TestDefaultConfig/#123
=== CONT  TestDefaultConfig/#151
=== CONT  TestDefaultConfig/#150
=== CONT  TestDefaultConfig/#149
=== CONT  TestDefaultConfig/#148
=== CONT  TestDefaultConfig/#147
=== CONT  TestDefaultConfig/#146
=== CONT  TestDefaultConfig/#145
=== CONT  TestDefaultConfig/#144
=== CONT  TestDefaultConfig/#143
=== CONT  TestDefaultConfig/#142
=== CONT  TestDefaultConfig/#141
=== CONT  TestDefaultConfig/#140
=== CONT  TestDefaultConfig/#139
=== CONT  TestDefaultConfig/#138
=== CONT  TestDefaultConfig/#137
=== CONT  TestDefaultConfig/#136
=== CONT  TestDefaultConfig/#135
=== CONT  TestDefaultConfig/#134
=== CONT  TestDefaultConfig/#133
=== CONT  TestDefaultConfig/#132
=== CONT  TestDefaultConfig/#131
=== CONT  TestDefaultConfig/#130
=== CONT  TestDefaultConfig/#129
=== CONT  TestDefaultConfig/#128
=== CONT  TestDefaultConfig/#127
=== CONT  TestDefaultConfig/#126
=== CONT  TestDefaultConfig/#125
=== CONT  TestDefaultConfig/#124
=== CONT  TestDefaultConfig/#122
=== CONT  TestDefaultConfig/#121
=== CONT  TestDefaultConfig/#120
=== CONT  TestDefaultConfig/#119
=== CONT  TestDefaultConfig/#118
=== CONT  TestDefaultConfig/#117
=== CONT  TestDefaultConfig/#116
=== CONT  TestDefaultConfig/#114
=== CONT  TestDefaultConfig/#113
=== CONT  TestDefaultConfig/#112
=== CONT  TestDefaultConfig/#111
=== CONT  TestDefaultConfig/#110
=== CONT  TestDefaultConfig/#109
=== CONT  TestDefaultConfig/#108
=== CONT  TestDefaultConfig/#107
=== CONT  TestDefaultConfig/#106
=== CONT  TestDefaultConfig/#105
=== CONT  TestDefaultConfig/#104
=== CONT  TestDefaultConfig/#103
=== CONT  TestDefaultConfig/#102
=== CONT  TestDefaultConfig/#101
=== CONT  TestDefaultConfig/#100
=== CONT  TestDefaultConfig/#470
=== CONT  TestDefaultConfig/#486
=== CONT  TestDefaultConfig/#485
=== CONT  TestDefaultConfig/#484
=== CONT  TestDefaultConfig/#483
=== CONT  TestDefaultConfig/#482
=== CONT  TestDefaultConfig/#481
=== CONT  TestDefaultConfig/#480
=== CONT  TestDefaultConfig/#479
=== CONT  TestDefaultConfig/#478
=== CONT  TestDefaultConfig/#477
=== CONT  TestDefaultConfig/#476
=== CONT  TestDefaultConfig/#475
=== CONT  TestDefaultConfig/#474
=== CONT  TestDefaultConfig/#473
=== CONT  TestDefaultConfig/#472
=== CONT  TestDefaultConfig/#471
=== CONT  TestDefaultConfig/#49
=== CONT  TestDefaultConfig/#98
=== CONT  TestDefaultConfig/#97
=== CONT  TestDefaultConfig/#96
=== CONT  TestDefaultConfig/#95
=== CONT  TestDefaultConfig/#94
=== CONT  TestDefaultConfig/#93
=== CONT  TestDefaultConfig/#92
=== CONT  TestDefaultConfig/#91
=== CONT  TestDefaultConfig/#90
=== CONT  TestDefaultConfig/#89
=== CONT  TestDefaultConfig/#88
=== CONT  TestDefaultConfig/#87
=== CONT  TestDefaultConfig/#86
=== CONT  TestDefaultConfig/#85
=== CONT  TestDefaultConfig/#84
=== CONT  TestDefaultConfig/#83
=== CONT  TestDefaultConfig/#82
=== CONT  TestDefaultConfig/#81
=== CONT  TestDefaultConfig/#80
=== CONT  TestDefaultConfig/#79
=== CONT  TestDefaultConfig/#78
=== CONT  TestDefaultConfig/#77
=== CONT  TestDefaultConfig/#76
=== CONT  TestDefaultConfig/#75
=== CONT  TestDefaultConfig/#74
=== CONT  TestDefaultConfig/#72
=== CONT  TestDefaultConfig/#71
=== CONT  TestDefaultConfig/#70
=== CONT  TestDefaultConfig/#69
=== CONT  TestDefaultConfig/#68
=== CONT  TestDefaultConfig/#67
=== CONT  TestDefaultConfig/#65
=== CONT  TestDefaultConfig/#64
=== CONT  TestDefaultConfig/#63
=== CONT  TestDefaultConfig/#62
=== CONT  TestDefaultConfig/#61
=== CONT  TestDefaultConfig/#60
=== CONT  TestDefaultConfig/#59
=== CONT  TestDefaultConfig/#58
=== CONT  TestDefaultConfig/#57
=== CONT  TestDefaultConfig/#56
=== CONT  TestDefaultConfig/#55
=== CONT  TestDefaultConfig/#54
=== CONT  TestDefaultConfig/#53
=== CONT  TestDefaultConfig/#52
=== CONT  TestDefaultConfig/#51
=== CONT  TestDefaultConfig/#50
=== CONT  TestDefaultConfig/#491
=== CONT  TestDefaultConfig/#493
=== CONT  TestDefaultConfig/#492
=== CONT  TestDefaultConfig/#25
=== CONT  TestDefaultConfig/#48
=== CONT  TestDefaultConfig/#47
=== CONT  TestDefaultConfig/#46
=== CONT  TestDefaultConfig/#45
=== CONT  TestDefaultConfig/#44
=== CONT  TestDefaultConfig/#43
=== CONT  TestDefaultConfig/#42
=== CONT  TestDefaultConfig/#41
=== CONT  TestDefaultConfig/#40
=== CONT  TestDefaultConfig/#39
=== CONT  TestDefaultConfig/#38
=== CONT  TestDefaultConfig/#37
=== CONT  TestDefaultConfig/#36
=== CONT  TestDefaultConfig/#35
=== CONT  TestDefaultConfig/#34
=== CONT  TestDefaultConfig/#33
=== CONT  TestDefaultConfig/#32
=== CONT  TestDefaultConfig/#31
=== CONT  TestDefaultConfig/#30
=== CONT  TestDefaultConfig/#29
=== CONT  TestDefaultConfig/#28
=== CONT  TestDefaultConfig/#27
=== CONT  TestDefaultConfig/#26
=== CONT  TestDefaultConfig/#490
=== CONT  TestDefaultConfig/#489
=== CONT  TestDefaultConfig/#13
=== CONT  TestDefaultConfig/#24
=== CONT  TestDefaultConfig/#23
=== CONT  TestDefaultConfig/#22
=== CONT  TestDefaultConfig/#21
=== CONT  TestDefaultConfig/#20
=== CONT  TestDefaultConfig/#19
=== CONT  TestDefaultConfig/#18
=== CONT  TestDefaultConfig/#17
=== CONT  TestDefaultConfig/#16
=== CONT  TestDefaultConfig/#15
=== CONT  TestDefaultConfig/#14
=== CONT  TestDefaultConfig/#07
=== CONT  TestDefaultConfig/#12
=== CONT  TestDefaultConfig/#11
=== CONT  TestDefaultConfig/#10
=== CONT  TestDefaultConfig/#09
=== CONT  TestDefaultConfig/#08
=== CONT  TestDefaultConfig/#461
=== CONT  TestDefaultConfig/#469
=== CONT  TestDefaultConfig/#468
=== CONT  TestDefaultConfig/#467
=== CONT  TestDefaultConfig/#466
=== CONT  TestDefaultConfig/#465
=== CONT  TestDefaultConfig/#464
=== CONT  TestDefaultConfig/#463
=== CONT  TestDefaultConfig/#462
=== CONT  TestDefaultConfig/#04
=== CONT  TestDefaultConfig/#06
=== CONT  TestDefaultConfig/#05
=== CONT  TestDefaultConfig/#02
=== CONT  TestDefaultConfig/#03
=== CONT  TestDefaultConfig/#457
=== CONT  TestDefaultConfig/#460
=== CONT  TestDefaultConfig/#459
=== CONT  TestDefaultConfig/#458
=== CONT  TestDefaultConfig/#455
=== CONT  TestDefaultConfig/#456
=== CONT  TestDefaultConfig/#454
=== CONT  TestDefaultConfig/#453
=== CONT  TestDefaultConfig/#01
--- PASS: TestDefaultConfig (0.14s)
    --- PASS: TestDefaultConfig/#488 (0.27s)
    --- PASS: TestDefaultConfig/#00 (0.36s)
    --- PASS: TestDefaultConfig/#487 (0.40s)
    --- PASS: TestDefaultConfig/#452 (0.42s)
    --- PASS: TestDefaultConfig/#451 (0.19s)
    --- PASS: TestDefaultConfig/#448 (0.09s)
    --- PASS: TestDefaultConfig/#450 (0.22s)
    --- PASS: TestDefaultConfig/#449 (0.20s)
    --- PASS: TestDefaultConfig/#447 (0.15s)
    --- PASS: TestDefaultConfig/#446 (0.15s)
    --- PASS: TestDefaultConfig/#444 (0.14s)
    --- PASS: TestDefaultConfig/#445 (0.21s)
    --- PASS: TestDefaultConfig/#443 (0.19s)
    --- PASS: TestDefaultConfig/#442 (0.36s)
    --- PASS: TestDefaultConfig/#441 (0.32s)
    --- PASS: TestDefaultConfig/#439 (0.29s)
    --- PASS: TestDefaultConfig/#440 (0.32s)
    --- PASS: TestDefaultConfig/#438 (0.17s)
    --- PASS: TestDefaultConfig/#436 (0.15s)
    --- PASS: TestDefaultConfig/#437 (0.18s)
    --- PASS: TestDefaultConfig/#435 (0.18s)
    --- PASS: TestDefaultConfig/#434 (0.17s)
    --- PASS: TestDefaultConfig/#433 (0.15s)
    --- PASS: TestDefaultConfig/#432 (0.19s)
    --- PASS: TestDefaultConfig/#431 (0.16s)
    --- PASS: TestDefaultConfig/#429 (0.22s)
    --- PASS: TestDefaultConfig/#430 (0.28s)
    --- PASS: TestDefaultConfig/#428 (0.33s)
    --- PASS: TestDefaultConfig/#427 (0.33s)
    --- PASS: TestDefaultConfig/#386 (0.28s)
    --- PASS: TestDefaultConfig/#424 (0.19s)
    --- PASS: TestDefaultConfig/#425 (0.23s)
    --- PASS: TestDefaultConfig/#426 (0.34s)
    --- PASS: TestDefaultConfig/#422 (0.16s)
    --- PASS: TestDefaultConfig/#423 (0.24s)
    --- PASS: TestDefaultConfig/#420 (0.19s)
    --- PASS: TestDefaultConfig/#421 (0.25s)
    --- PASS: TestDefaultConfig/#419 (0.16s)
    --- PASS: TestDefaultConfig/#418 (0.36s)
    --- PASS: TestDefaultConfig/#417 (0.32s)
    --- PASS: TestDefaultConfig/#415 (0.35s)
    --- PASS: TestDefaultConfig/#416 (0.47s)
    --- PASS: TestDefaultConfig/#413 (0.24s)
    --- PASS: TestDefaultConfig/#414 (0.33s)
    --- PASS: TestDefaultConfig/#412 (0.19s)
    --- PASS: TestDefaultConfig/#411 (0.19s)
    --- PASS: TestDefaultConfig/#409 (0.15s)
    --- PASS: TestDefaultConfig/#408 (0.17s)
    --- PASS: TestDefaultConfig/#410 (0.28s)
    --- PASS: TestDefaultConfig/#407 (0.23s)
    --- PASS: TestDefaultConfig/#405 (0.17s)
    --- PASS: TestDefaultConfig/#406 (0.35s)
    --- PASS: TestDefaultConfig/#404 (0.34s)
    --- PASS: TestDefaultConfig/#403 (0.35s)
    --- PASS: TestDefaultConfig/#400 (0.17s)
    --- PASS: TestDefaultConfig/#401 (0.22s)
    --- PASS: TestDefaultConfig/#402 (0.39s)
    --- PASS: TestDefaultConfig/#399 (0.13s)
    --- PASS: TestDefaultConfig/#396 (0.11s)
    --- PASS: TestDefaultConfig/#397 (0.18s)
    --- PASS: TestDefaultConfig/#398 (0.20s)
    --- PASS: TestDefaultConfig/#394 (0.19s)
    --- PASS: TestDefaultConfig/#395 (0.27s)
    --- PASS: TestDefaultConfig/#393 (0.25s)
    --- PASS: TestDefaultConfig/#392 (0.38s)
    --- PASS: TestDefaultConfig/#391 (0.31s)
    --- PASS: TestDefaultConfig/#390 (0.47s)
    --- PASS: TestDefaultConfig/#389 (0.39s)
    --- PASS: TestDefaultConfig/#388 (0.27s)
    --- PASS: TestDefaultConfig/#387 (0.32s)
    --- PASS: TestDefaultConfig/#385 (0.17s)
    --- PASS: TestDefaultConfig/#383 (0.17s)
    --- PASS: TestDefaultConfig/#384 (0.21s)
    --- PASS: TestDefaultConfig/#382 (0.22s)
    --- PASS: TestDefaultConfig/#381 (0.20s)
    --- PASS: TestDefaultConfig/#380 (0.22s)
    --- PASS: TestDefaultConfig/#379 (0.23s)
    --- PASS: TestDefaultConfig/#377 (0.27s)
    --- PASS: TestDefaultConfig/#378 (0.44s)
    --- PASS: TestDefaultConfig/#376 (0.41s)
    --- PASS: TestDefaultConfig/#375 (0.39s)
    --- PASS: TestDefaultConfig/#373 (0.21s)
    --- PASS: TestDefaultConfig/#372 (0.17s)
    --- PASS: TestDefaultConfig/#374 (0.35s)
    --- PASS: TestDefaultConfig/#371 (0.25s)
    --- PASS: TestDefaultConfig/#370 (0.22s)
    --- PASS: TestDefaultConfig/#368 (0.19s)
    --- PASS: TestDefaultConfig/#369 (0.24s)
    --- PASS: TestDefaultConfig/#367 (0.20s)
    --- PASS: TestDefaultConfig/#336 (0.28s)
    --- PASS: TestDefaultConfig/#366 (0.37s)
    --- PASS: TestDefaultConfig/#364 (0.38s)
    --- PASS: TestDefaultConfig/#365 (0.53s)
    --- PASS: TestDefaultConfig/#363 (0.35s)
    --- PASS: TestDefaultConfig/#362 (0.33s)
    --- PASS: TestDefaultConfig/#361 (0.26s)
    --- PASS: TestDefaultConfig/#360 (0.21s)
    --- PASS: TestDefaultConfig/#359 (0.16s)
    --- PASS: TestDefaultConfig/#358 (0.17s)
    --- PASS: TestDefaultConfig/#357 (0.19s)
    --- PASS: TestDefaultConfig/#356 (0.17s)
    --- PASS: TestDefaultConfig/#355 (0.18s)
    --- PASS: TestDefaultConfig/#354 (0.16s)
    --- PASS: TestDefaultConfig/#353 (0.15s)
    --- PASS: TestDefaultConfig/#352 (0.27s)
    --- PASS: TestDefaultConfig/#351 (0.42s)
    --- PASS: TestDefaultConfig/#350 (0.44s)
    --- PASS: TestDefaultConfig/#349 (0.40s)
    --- PASS: TestDefaultConfig/#348 (0.33s)
    --- PASS: TestDefaultConfig/#347 (0.18s)
    --- PASS: TestDefaultConfig/#345 (0.17s)
    --- PASS: TestDefaultConfig/#346 (0.21s)
    --- PASS: TestDefaultConfig/#344 (0.14s)
    --- PASS: TestDefaultConfig/#343 (0.24s)
    --- PASS: TestDefaultConfig/#342 (0.20s)
    --- PASS: TestDefaultConfig/#341 (0.18s)
    --- PASS: TestDefaultConfig/#340 (0.25s)
    --- PASS: TestDefaultConfig/#337 (0.31s)
    --- PASS: TestDefaultConfig/#339 (0.35s)
    --- PASS: TestDefaultConfig/#338 (0.35s)
    --- PASS: TestDefaultConfig/#335 (0.30s)
    --- PASS: TestDefaultConfig/#333 (0.13s)
    --- PASS: TestDefaultConfig/#332 (0.16s)
    --- PASS: TestDefaultConfig/#334 (0.18s)
    --- PASS: TestDefaultConfig/#330 (0.12s)
    --- PASS: TestDefaultConfig/#331 (0.25s)
    --- PASS: TestDefaultConfig/#329 (0.15s)
    --- PASS: TestDefaultConfig/#328 (0.18s)
    --- PASS: TestDefaultConfig/#327 (0.16s)
    --- PASS: TestDefaultConfig/#325 (0.27s)
    --- PASS: TestDefaultConfig/#326 (0.35s)
    --- PASS: TestDefaultConfig/#324 (0.34s)
    --- PASS: TestDefaultConfig/#195 (0.33s)
    --- PASS: TestDefaultConfig/#323 (0.31s)
    --- PASS: TestDefaultConfig/#322 (0.27s)
    --- PASS: TestDefaultConfig/#320 (0.21s)
    --- PASS: TestDefaultConfig/#321 (0.31s)
    --- PASS: TestDefaultConfig/#318 (0.17s)
    --- PASS: TestDefaultConfig/#319 (0.21s)
    --- PASS: TestDefaultConfig/#316 (0.13s)
    --- PASS: TestDefaultConfig/#317 (0.23s)
    --- PASS: TestDefaultConfig/#315 (0.16s)
    --- PASS: TestDefaultConfig/#313 (0.23s)
    --- PASS: TestDefaultConfig/#314 (0.34s)
    --- PASS: TestDefaultConfig/#312 (0.28s)
    --- PASS: TestDefaultConfig/#311 (0.36s)
    --- PASS: TestDefaultConfig/#310 (0.32s)
    --- PASS: TestDefaultConfig/#309 (0.29s)
    --- PASS: TestDefaultConfig/#308 (0.28s)
    --- PASS: TestDefaultConfig/#301 (0.13s)
    --- PASS: TestDefaultConfig/#307 (0.21s)
    --- PASS: TestDefaultConfig/#305 (0.13s)
    --- PASS: TestDefaultConfig/#306 (0.18s)
    --- PASS: TestDefaultConfig/#304 (0.17s)
    --- PASS: TestDefaultConfig/#303 (0.15s)
    --- PASS: TestDefaultConfig/#302 (0.14s)
    --- PASS: TestDefaultConfig/#299 (0.31s)
    --- PASS: TestDefaultConfig/#300 (0.42s)
    --- PASS: TestDefaultConfig/#298 (0.49s)
    --- PASS: TestDefaultConfig/#296 (0.24s)
    --- PASS: TestDefaultConfig/#297 (0.48s)
    --- PASS: TestDefaultConfig/#295 (0.26s)
    --- PASS: TestDefaultConfig/#292 (0.13s)
    --- PASS: TestDefaultConfig/#294 (0.21s)
    --- PASS: TestDefaultConfig/#291 (0.18s)
    --- PASS: TestDefaultConfig/#293 (0.26s)
    --- PASS: TestDefaultConfig/#290 (0.18s)
    --- PASS: TestDefaultConfig/#289 (0.22s)
    --- PASS: TestDefaultConfig/#287 (0.17s)
    --- PASS: TestDefaultConfig/#288 (0.19s)
    --- PASS: TestDefaultConfig/#286 (0.33s)
    --- PASS: TestDefaultConfig/#285 (0.33s)
    --- PASS: TestDefaultConfig/#284 (0.43s)
    --- PASS: TestDefaultConfig/#283 (0.42s)
    --- PASS: TestDefaultConfig/#281 (0.25s)
    --- PASS: TestDefaultConfig/#282 (0.29s)
    --- PASS: TestDefaultConfig/#279 (0.17s)
    --- PASS: TestDefaultConfig/#280 (0.17s)
    --- PASS: TestDefaultConfig/#277 (0.18s)
    --- PASS: TestDefaultConfig/#276 (0.16s)
    --- PASS: TestDefaultConfig/#278 (0.25s)
    --- PASS: TestDefaultConfig/#275 (0.18s)
    --- PASS: TestDefaultConfig/#274 (0.15s)
    --- PASS: TestDefaultConfig/#273 (0.38s)
    --- PASS: TestDefaultConfig/#272 (0.39s)
    --- PASS: TestDefaultConfig/#271 (0.38s)
    --- PASS: TestDefaultConfig/#270 (0.33s)
    --- PASS: TestDefaultConfig/#268 (0.14s)
    --- PASS: TestDefaultConfig/#269 (0.16s)
    --- PASS: TestDefaultConfig/#267 (0.18s)
    --- PASS: TestDefaultConfig/#266 (0.18s)
    --- PASS: TestDefaultConfig/#265 (0.15s)
    --- PASS: TestDefaultConfig/#264 (0.16s)
    --- PASS: TestDefaultConfig/#263 (0.21s)
    --- PASS: TestDefaultConfig/#262 (0.17s)
    --- PASS: TestDefaultConfig/#260 (0.13s)
    --- PASS: TestDefaultConfig/#261 (0.32s)
    --- PASS: TestDefaultConfig/#259 (0.28s)
    --- PASS: TestDefaultConfig/#258 (0.33s)
    --- PASS: TestDefaultConfig/#257 (0.33s)
    --- PASS: TestDefaultConfig/#256 (0.23s)
    --- PASS: TestDefaultConfig/#255 (0.20s)
    --- PASS: TestDefaultConfig/#254 (0.22s)
    --- PASS: TestDefaultConfig/#253 (0.26s)
    --- PASS: TestDefaultConfig/#252 (0.27s)
    --- PASS: TestDefaultConfig/#251 (0.28s)
    --- PASS: TestDefaultConfig/#249 (0.17s)
    --- PASS: TestDefaultConfig/#250 (0.26s)
    --- PASS: TestDefaultConfig/#247 (0.31s)
    --- PASS: TestDefaultConfig/#248 (0.36s)
    --- PASS: TestDefaultConfig/#238 (0.32s)
    --- PASS: TestDefaultConfig/#246 (0.36s)
    --- PASS: TestDefaultConfig/#245 (0.21s)
    --- PASS: TestDefaultConfig/#244 (0.26s)
    --- PASS: TestDefaultConfig/#243 (0.22s)
    --- PASS: TestDefaultConfig/#242 (0.21s)
    --- PASS: TestDefaultConfig/#241 (0.19s)
    --- PASS: TestDefaultConfig/#237 (0.17s)
    --- PASS: TestDefaultConfig/#239 (0.20s)
    --- PASS: TestDefaultConfig/#240 (0.24s)
    --- PASS: TestDefaultConfig/#236 (0.20s)
    --- PASS: TestDefaultConfig/#235 (0.13s)
    --- PASS: TestDefaultConfig/#234 (0.16s)
    --- PASS: TestDefaultConfig/#233 (0.43s)
    --- PASS: TestDefaultConfig/#232 (0.38s)
    --- PASS: TestDefaultConfig/#230 (0.38s)
    --- PASS: TestDefaultConfig/#231 (0.51s)
    --- PASS: TestDefaultConfig/#228 (0.21s)
    --- PASS: TestDefaultConfig/#229 (0.22s)
    --- PASS: TestDefaultConfig/#227 (0.19s)
    --- PASS: TestDefaultConfig/#224 (0.15s)
    --- PASS: TestDefaultConfig/#226 (0.21s)
    --- PASS: TestDefaultConfig/#225 (0.16s)
    --- PASS: TestDefaultConfig/#223 (0.15s)
    --- PASS: TestDefaultConfig/#222 (0.23s)
    --- PASS: TestDefaultConfig/#221 (0.25s)
    --- PASS: TestDefaultConfig/#219 (0.29s)
    --- PASS: TestDefaultConfig/#220 (0.33s)
    --- PASS: TestDefaultConfig/#217 (0.25s)
    --- PASS: TestDefaultConfig/#218 (0.28s)
    --- PASS: TestDefaultConfig/#215 (0.19s)
    --- PASS: TestDefaultConfig/#216 (0.21s)
    --- PASS: TestDefaultConfig/#213 (0.16s)
    --- PASS: TestDefaultConfig/#211 (0.15s)
    --- PASS: TestDefaultConfig/#212 (0.19s)
    --- PASS: TestDefaultConfig/#214 (0.22s)
    --- PASS: TestDefaultConfig/#210 (0.13s)
    --- PASS: TestDefaultConfig/#209 (0.18s)
    --- PASS: TestDefaultConfig/#207 (0.15s)
    --- PASS: TestDefaultConfig/#208 (0.27s)
    --- PASS: TestDefaultConfig/#206 (0.38s)
    --- PASS: TestDefaultConfig/#205 (0.32s)
    --- PASS: TestDefaultConfig/#203 (0.25s)
    --- PASS: TestDefaultConfig/#204 (0.36s)
    --- PASS: TestDefaultConfig/#202 (0.17s)
    --- PASS: TestDefaultConfig/#200 (0.14s)
    --- PASS: TestDefaultConfig/#201 (0.24s)
    --- PASS: TestDefaultConfig/#199 (0.22s)
    --- PASS: TestDefaultConfig/#198 (0.18s)
    --- PASS: TestDefaultConfig/#197 (0.20s)
    --- PASS: TestDefaultConfig/#196 (0.16s)
    --- PASS: TestDefaultConfig/#66 (0.38s)
    --- PASS: TestDefaultConfig/#499 (0.33s)
    --- PASS: TestDefaultConfig/#494 (0.43s)
    --- PASS: TestDefaultConfig/#498 (0.38s)
    --- PASS: TestDefaultConfig/#497 (0.36s)
    --- PASS: TestDefaultConfig/#495 (0.28s)
    --- PASS: TestDefaultConfig/#496 (0.34s)
    --- PASS: TestDefaultConfig/#99 (0.35s)
    --- PASS: TestDefaultConfig/#194 (0.28s)
    --- PASS: TestDefaultConfig/#192 (0.28s)
    --- PASS: TestDefaultConfig/#193 (0.32s)
    --- PASS: TestDefaultConfig/#191 (0.23s)
    --- PASS: TestDefaultConfig/#188 (0.15s)
    --- PASS: TestDefaultConfig/#189 (0.22s)
    --- PASS: TestDefaultConfig/#190 (0.33s)
    --- PASS: TestDefaultConfig/#187 (0.28s)
    --- PASS: TestDefaultConfig/#186 (0.23s)
    --- PASS: TestDefaultConfig/#183 (0.15s)
    --- PASS: TestDefaultConfig/#184 (0.18s)
    --- PASS: TestDefaultConfig/#185 (0.26s)
    --- PASS: TestDefaultConfig/#182 (0.17s)
    --- PASS: TestDefaultConfig/#181 (0.16s)
    --- PASS: TestDefaultConfig/#73 (0.15s)
    --- PASS: TestDefaultConfig/#180 (0.14s)
    --- PASS: TestDefaultConfig/#179 (0.13s)
    --- PASS: TestDefaultConfig/#176 (0.11s)
    --- PASS: TestDefaultConfig/#178 (0.13s)
    --- PASS: TestDefaultConfig/#177 (0.17s)
    --- PASS: TestDefaultConfig/#175 (0.31s)
    --- PASS: TestDefaultConfig/#173 (0.30s)
    --- PASS: TestDefaultConfig/#174 (0.34s)
    --- PASS: TestDefaultConfig/#172 (0.32s)
    --- PASS: TestDefaultConfig/#171 (0.17s)
    --- PASS: TestDefaultConfig/#170 (0.20s)
    --- PASS: TestDefaultConfig/#168 (0.14s)
    --- PASS: TestDefaultConfig/#169 (0.20s)
    --- PASS: TestDefaultConfig/#167 (0.14s)
    --- PASS: TestDefaultConfig/#166 (0.14s)
    --- PASS: TestDefaultConfig/#164 (0.13s)
    --- PASS: TestDefaultConfig/#165 (0.22s)
    --- PASS: TestDefaultConfig/#163 (0.28s)
    --- PASS: TestDefaultConfig/#162 (0.28s)
    --- PASS: TestDefaultConfig/#161 (0.24s)
    --- PASS: TestDefaultConfig/#115 (0.35s)
    --- PASS: TestDefaultConfig/#159 (0.10s)
    --- PASS: TestDefaultConfig/#160 (0.17s)
    --- PASS: TestDefaultConfig/#157 (0.15s)
    --- PASS: TestDefaultConfig/#158 (0.19s)
    --- PASS: TestDefaultConfig/#155 (0.14s)
    --- PASS: TestDefaultConfig/#156 (0.19s)
    --- PASS: TestDefaultConfig/#152 (0.12s)
    --- PASS: TestDefaultConfig/#154 (0.15s)
    --- PASS: TestDefaultConfig/#153 (0.16s)
    --- PASS: TestDefaultConfig/#123 (0.26s)
    --- PASS: TestDefaultConfig/#151 (0.28s)
    --- PASS: TestDefaultConfig/#149 (0.31s)
    --- PASS: TestDefaultConfig/#150 (0.32s)
    --- PASS: TestDefaultConfig/#148 (0.24s)
    --- PASS: TestDefaultConfig/#147 (0.15s)
    --- PASS: TestDefaultConfig/#146 (0.18s)
    --- PASS: TestDefaultConfig/#145 (0.21s)
    --- PASS: TestDefaultConfig/#144 (0.14s)
    --- PASS: TestDefaultConfig/#143 (0.14s)
    --- PASS: TestDefaultConfig/#142 (0.17s)
    --- PASS: TestDefaultConfig/#140 (0.16s)
    --- PASS: TestDefaultConfig/#141 (0.18s)
    --- PASS: TestDefaultConfig/#139 (0.19s)
    --- PASS: TestDefaultConfig/#137 (0.34s)
    --- PASS: TestDefaultConfig/#138 (0.36s)
    --- PASS: TestDefaultConfig/#135 (0.31s)
    --- PASS: TestDefaultConfig/#136 (0.37s)
    --- PASS: TestDefaultConfig/#133 (0.15s)
    --- PASS: TestDefaultConfig/#134 (0.19s)
    --- PASS: TestDefaultConfig/#131 (0.15s)
    --- PASS: TestDefaultConfig/#132 (0.21s)
    --- PASS: TestDefaultConfig/#130 (0.17s)
    --- PASS: TestDefaultConfig/#127 (0.15s)
    --- PASS: TestDefaultConfig/#128 (0.19s)
    --- PASS: TestDefaultConfig/#129 (0.25s)
    --- PASS: TestDefaultConfig/#126 (0.40s)
    --- PASS: TestDefaultConfig/#124 (0.49s)
    --- PASS: TestDefaultConfig/#125 (0.50s)
    --- PASS: TestDefaultConfig/#122 (0.51s)
    --- PASS: TestDefaultConfig/#121 (0.24s)
    --- PASS: TestDefaultConfig/#119 (0.21s)
    --- PASS: TestDefaultConfig/#117 (0.18s)
    --- PASS: TestDefaultConfig/#118 (0.21s)
    --- PASS: TestDefaultConfig/#120 (0.30s)
    --- PASS: TestDefaultConfig/#116 (0.18s)
    --- PASS: TestDefaultConfig/#114 (0.19s)
    --- PASS: TestDefaultConfig/#113 (0.21s)
    --- PASS: TestDefaultConfig/#112 (0.26s)
    --- PASS: TestDefaultConfig/#111 (0.30s)
    --- PASS: TestDefaultConfig/#110 (0.39s)
    --- PASS: TestDefaultConfig/#109 (0.38s)
    --- PASS: TestDefaultConfig/#108 (0.43s)
    --- PASS: TestDefaultConfig/#107 (0.37s)
    --- PASS: TestDefaultConfig/#106 (0.32s)
    --- PASS: TestDefaultConfig/#105 (0.31s)
    --- PASS: TestDefaultConfig/#104 (0.20s)
    --- PASS: TestDefaultConfig/#103 (0.21s)
    --- PASS: TestDefaultConfig/#101 (0.27s)
    --- PASS: TestDefaultConfig/#102 (0.34s)
    --- PASS: TestDefaultConfig/#100 (0.34s)
    --- PASS: TestDefaultConfig/#470 (0.31s)
    --- PASS: TestDefaultConfig/#486 (0.25s)
    --- PASS: TestDefaultConfig/#484 (0.36s)
    --- PASS: TestDefaultConfig/#485 (0.48s)
    --- PASS: TestDefaultConfig/#483 (0.60s)
    --- PASS: TestDefaultConfig/#482 (0.51s)
    --- PASS: TestDefaultConfig/#481 (0.42s)
    --- PASS: TestDefaultConfig/#480 (0.36s)
    --- PASS: TestDefaultConfig/#479 (0.20s)
    --- PASS: TestDefaultConfig/#478 (0.26s)
    --- PASS: TestDefaultConfig/#476 (0.17s)
    --- PASS: TestDefaultConfig/#475 (0.18s)
    --- PASS: TestDefaultConfig/#474 (0.22s)
    --- PASS: TestDefaultConfig/#477 (0.36s)
    --- PASS: TestDefaultConfig/#473 (0.27s)
    --- PASS: TestDefaultConfig/#472 (0.48s)
    --- PASS: TestDefaultConfig/#49 (0.46s)
    --- PASS: TestDefaultConfig/#471 (0.51s)
    --- PASS: TestDefaultConfig/#98 (0.42s)
    --- PASS: TestDefaultConfig/#97 (0.27s)
    --- PASS: TestDefaultConfig/#96 (0.19s)
    --- PASS: TestDefaultConfig/#95 (0.20s)
    --- PASS: TestDefaultConfig/#94 (0.22s)
    --- PASS: TestDefaultConfig/#93 (0.15s)
    --- PASS: TestDefaultConfig/#92 (0.21s)
    --- PASS: TestDefaultConfig/#90 (0.16s)
    --- PASS: TestDefaultConfig/#89 (0.15s)
    --- PASS: TestDefaultConfig/#91 (0.23s)
    --- PASS: TestDefaultConfig/#88 (0.24s)
    --- PASS: TestDefaultConfig/#87 (0.25s)
    --- PASS: TestDefaultConfig/#85 (0.23s)
    --- PASS: TestDefaultConfig/#86 (0.27s)
    --- PASS: TestDefaultConfig/#84 (0.12s)
    --- PASS: TestDefaultConfig/#83 (0.14s)
    --- PASS: TestDefaultConfig/#82 (0.14s)
    --- PASS: TestDefaultConfig/#80 (0.11s)
    --- PASS: TestDefaultConfig/#81 (0.14s)
    --- PASS: TestDefaultConfig/#79 (0.15s)
    --- PASS: TestDefaultConfig/#77 (0.14s)
    --- PASS: TestDefaultConfig/#76 (0.12s)
    --- PASS: TestDefaultConfig/#78 (0.18s)
    --- PASS: TestDefaultConfig/#75 (0.31s)
    --- PASS: TestDefaultConfig/#74 (0.31s)
    --- PASS: TestDefaultConfig/#72 (0.32s)
    --- PASS: TestDefaultConfig/#71 (0.30s)
    --- PASS: TestDefaultConfig/#70 (0.14s)
    --- PASS: TestDefaultConfig/#68 (0.15s)
    --- PASS: TestDefaultConfig/#69 (0.20s)
    --- PASS: TestDefaultConfig/#67 (0.18s)
    --- PASS: TestDefaultConfig/#65 (0.10s)
    --- PASS: TestDefaultConfig/#64 (0.19s)
    --- PASS: TestDefaultConfig/#62 (0.14s)
    --- PASS: TestDefaultConfig/#63 (0.15s)
    --- PASS: TestDefaultConfig/#61 (0.12s)
    --- PASS: TestDefaultConfig/#57 (0.33s)
    --- PASS: TestDefaultConfig/#60 (0.34s)
    --- PASS: TestDefaultConfig/#59 (0.36s)
    --- PASS: TestDefaultConfig/#58 (0.36s)
    --- PASS: TestDefaultConfig/#55 (0.14s)
    --- PASS: TestDefaultConfig/#56 (0.15s)
    --- PASS: TestDefaultConfig/#54 (0.14s)
    --- PASS: TestDefaultConfig/#53 (0.16s)
    --- PASS: TestDefaultConfig/#50 (0.13s)
    --- PASS: TestDefaultConfig/#52 (0.16s)
    --- PASS: TestDefaultConfig/#51 (0.21s)
    --- PASS: TestDefaultConfig/#491 (0.18s)
    --- PASS: TestDefaultConfig/#493 (0.19s)
    --- PASS: TestDefaultConfig/#492 (0.28s)
    --- PASS: TestDefaultConfig/#25 (0.38s)
    --- PASS: TestDefaultConfig/#48 (0.41s)
    --- PASS: TestDefaultConfig/#47 (0.31s)
    --- PASS: TestDefaultConfig/#46 (0.26s)
    --- PASS: TestDefaultConfig/#45 (0.17s)
    --- PASS: TestDefaultConfig/#43 (0.14s)
    --- PASS: TestDefaultConfig/#44 (0.17s)
    --- PASS: TestDefaultConfig/#42 (0.16s)
    --- PASS: TestDefaultConfig/#41 (0.12s)
    --- PASS: TestDefaultConfig/#39 (0.12s)
    --- PASS: TestDefaultConfig/#40 (0.18s)
    --- PASS: TestDefaultConfig/#37 (0.15s)
    --- PASS: TestDefaultConfig/#38 (0.24s)
    --- PASS: TestDefaultConfig/#36 (0.26s)
    --- PASS: TestDefaultConfig/#35 (0.27s)
    --- PASS: TestDefaultConfig/#34 (0.37s)
    --- PASS: TestDefaultConfig/#32 (0.25s)
    --- PASS: TestDefaultConfig/#33 (0.34s)
    --- PASS: TestDefaultConfig/#31 (0.22s)
    --- PASS: TestDefaultConfig/#28 (0.10s)
    --- PASS: TestDefaultConfig/#29 (0.14s)
    --- PASS: TestDefaultConfig/#30 (0.18s)
    --- PASS: TestDefaultConfig/#27 (0.19s)
    --- PASS: TestDefaultConfig/#490 (0.12s)
    --- PASS: TestDefaultConfig/#26 (0.19s)
    --- PASS: TestDefaultConfig/#489 (0.14s)
    --- PASS: TestDefaultConfig/#13 (0.13s)
    --- PASS: TestDefaultConfig/#24 (0.24s)
    --- PASS: TestDefaultConfig/#23 (0.31s)
    --- PASS: TestDefaultConfig/#21 (0.27s)
    --- PASS: TestDefaultConfig/#22 (0.40s)
    --- PASS: TestDefaultConfig/#18 (0.15s)
    --- PASS: TestDefaultConfig/#20 (0.28s)
    --- PASS: TestDefaultConfig/#19 (0.19s)
    --- PASS: TestDefaultConfig/#17 (0.17s)
    --- PASS: TestDefaultConfig/#15 (0.18s)
    --- PASS: TestDefaultConfig/#14 (0.16s)
    --- PASS: TestDefaultConfig/#16 (0.20s)
    --- PASS: TestDefaultConfig/#07 (0.16s)
    --- PASS: TestDefaultConfig/#12 (0.27s)
    --- PASS: TestDefaultConfig/#10 (0.34s)
    --- PASS: TestDefaultConfig/#11 (0.35s)
    --- PASS: TestDefaultConfig/#09 (0.42s)
    --- PASS: TestDefaultConfig/#08 (0.24s)
    --- PASS: TestDefaultConfig/#461 (0.25s)
    --- PASS: TestDefaultConfig/#469 (0.27s)
    --- PASS: TestDefaultConfig/#468 (0.14s)
    --- PASS: TestDefaultConfig/#467 (0.19s)
    --- PASS: TestDefaultConfig/#466 (0.16s)
    --- PASS: TestDefaultConfig/#465 (0.18s)
    --- PASS: TestDefaultConfig/#464 (0.19s)
    --- PASS: TestDefaultConfig/#463 (0.28s)
    --- PASS: TestDefaultConfig/#462 (0.21s)
    --- PASS: TestDefaultConfig/#04 (0.26s)
    --- PASS: TestDefaultConfig/#06 (0.32s)
    --- PASS: TestDefaultConfig/#03 (0.14s)
    --- PASS: TestDefaultConfig/#02 (0.22s)
    --- PASS: TestDefaultConfig/#05 (0.23s)
    --- PASS: TestDefaultConfig/#457 (0.19s)
    --- PASS: TestDefaultConfig/#459 (0.13s)
    --- PASS: TestDefaultConfig/#460 (0.16s)
    --- PASS: TestDefaultConfig/#458 (0.19s)
    --- PASS: TestDefaultConfig/#456 (0.10s)
    --- PASS: TestDefaultConfig/#455 (0.18s)
    --- PASS: TestDefaultConfig/#454 (0.17s)
    --- PASS: TestDefaultConfig/#453 (0.14s)
    --- PASS: TestDefaultConfig/#01 (0.20s)
=== RUN   TestTxnEndpoint_Bad_JSON
=== PAUSE TestTxnEndpoint_Bad_JSON
=== RUN   TestTxnEndpoint_Bad_Size_Item
=== PAUSE TestTxnEndpoint_Bad_Size_Item
=== RUN   TestTxnEndpoint_Bad_Size_Net
=== PAUSE TestTxnEndpoint_Bad_Size_Net
=== RUN   TestTxnEndpoint_Bad_Size_Ops
=== PAUSE TestTxnEndpoint_Bad_Size_Ops
=== RUN   TestTxnEndpoint_KV_Actions
=== PAUSE TestTxnEndpoint_KV_Actions
=== RUN   TestTxnEndpoint_UpdateCheck
=== PAUSE TestTxnEndpoint_UpdateCheck
=== RUN   TestUiIndex
=== PAUSE TestUiIndex
=== RUN   TestUiNodes
=== PAUSE TestUiNodes
=== RUN   TestUiNodes_Filter
=== PAUSE TestUiNodes_Filter
=== RUN   TestUiNodeInfo
=== PAUSE TestUiNodeInfo
=== RUN   TestUiServices
=== PAUSE TestUiServices
=== RUN   TestValidateUserEventParams
=== PAUSE TestValidateUserEventParams
=== RUN   TestShouldProcessUserEvent
=== PAUSE TestShouldProcessUserEvent
=== RUN   TestIngestUserEvent
=== PAUSE TestIngestUserEvent
=== RUN   TestFireReceiveEvent
=== PAUSE TestFireReceiveEvent
=== RUN   TestUserEventToken
=== PAUSE TestUserEventToken
=== RUN   TestStringHash
=== PAUSE TestStringHash
=== RUN   TestSetFilePermissions
=== PAUSE TestSetFilePermissions
=== RUN   TestDurationFixer
--- PASS: TestDurationFixer (0.00s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== RUN   TestForwardSignals
=== RUN   TestForwardSignals/signal-interrupt
=== RUN   TestForwardSignals/signal-terminated
--- PASS: TestForwardSignals (1.18s)
    --- PASS: TestForwardSignals/signal-interrupt (0.63s)
    --- PASS: TestForwardSignals/signal-terminated (0.55s)
=== RUN   TestMakeWatchHandler
=== PAUSE TestMakeWatchHandler
=== RUN   TestMakeHTTPWatchHandler
=== PAUSE TestMakeHTTPWatchHandler
=== CONT  TestACL_Legacy_Disabled_Response
=== CONT  TestAgent_purgeCheckState
=== CONT  TestMakeHTTPWatchHandler
=== CONT  TestKVSEndpoint_GET_Raw
2019/12/30 18:53:32 [TRACE] agent: http watch handler 'http://127.0.0.1:33325' output: Ok, i see
--- PASS: TestMakeHTTPWatchHandler (0.02s)
=== CONT  TestKVSEndpoint_AcquireRelease
WARNING: bootstrap = true: do not enable unless necessary
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.024931 [WARN] agent: Node name "Node 3124a1e3-6ee4-8f1f-2401-ace7e2238ff9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.025325 [DEBUG] tlsutil: Update with version 1
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.027536 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:33.037636 [WARN] agent: Node name "Node b2d8c26f-419f-e45d-dcc0-b8affbed31a9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:33.038585 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:33.041697 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_purgeCheckState - 2019/12/30 18:53:33.046122 [WARN] agent: Node name "Node 0855e1a9-151a-d342-d6fd-a7f2f77cc49c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_purgeCheckState - 2019/12/30 18:53:33.046932 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_purgeCheckState - 2019/12/30 18:53:33.050169 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:33.050302 [WARN] agent: Node name "Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:33.059071 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:33.061714 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3124a1e3-6ee4-8f1f-2401-ace7e2238ff9 Address:127.0.0.1:17746}]
2019/12/30 18:53:33 [INFO]  raft: Node at 127.0.0.1:17746 [Follower] entering Follower state (Leader: "")
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.893626 [INFO] serf: EventMemberJoin: Node 3124a1e3-6ee4-8f1f-2401-ace7e2238ff9.dc1 127.0.0.1
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.897343 [INFO] serf: EventMemberJoin: Node 3124a1e3-6ee4-8f1f-2401-ace7e2238ff9 127.0.0.1
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.898195 [INFO] consul: Handled member-join event for server "Node 3124a1e3-6ee4-8f1f-2401-ace7e2238ff9.dc1" in area "wan"
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.898453 [INFO] consul: Adding LAN server Node 3124a1e3-6ee4-8f1f-2401-ace7e2238ff9 (Addr: tcp/127.0.0.1:17746) (DC: dc1)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.898603 [INFO] agent: Started DNS server 127.0.0.1:17741 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.909858 [INFO] agent: Started DNS server 127.0.0.1:17741 (udp)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.912626 [INFO] agent: Started HTTP server on 127.0.0.1:17742 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:33.912746 [INFO] agent: started state syncer
2019/12/30 18:53:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:33 [INFO]  raft: Node at 127.0.0.1:17746 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0855e1a9-151a-d342-d6fd-a7f2f77cc49c Address:127.0.0.1:17758}]
2019/12/30 18:53:33 [INFO]  raft: Node at 127.0.0.1:17758 [Follower] entering Follower state (Leader: "")
2019/12/30 18:53:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1e9f3ef7-c8a0-7c79-9831-1397fa7743c9 Address:127.0.0.1:17764}]
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:33.998287 [INFO] serf: EventMemberJoin: Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9.dc1 127.0.0.1
2019/12/30 18:53:33 [INFO]  raft: Node at 127.0.0.1:17764 [Follower] entering Follower state (Leader: "")
TestAgent_purgeCheckState - 2019/12/30 18:53:34.002896 [INFO] serf: EventMemberJoin: Node 0855e1a9-151a-d342-d6fd-a7f2f77cc49c.dc1 127.0.0.1
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.012794 [INFO] serf: EventMemberJoin: Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9 127.0.0.1
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.015127 [INFO] consul: Adding LAN server Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9 (Addr: tcp/127.0.0.1:17764) (DC: dc1)
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.015385 [INFO] consul: Handled member-join event for server "Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9.dc1" in area "wan"
TestAgent_purgeCheckState - 2019/12/30 18:53:34.016474 [INFO] serf: EventMemberJoin: Node 0855e1a9-151a-d342-d6fd-a7f2f77cc49c 127.0.0.1
TestAgent_purgeCheckState - 2019/12/30 18:53:34.017122 [INFO] consul: Adding LAN server Node 0855e1a9-151a-d342-d6fd-a7f2f77cc49c (Addr: tcp/127.0.0.1:17758) (DC: dc1)
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.017866 [INFO] agent: Started DNS server 127.0.0.1:17759 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.018529 [INFO] agent: Started DNS server 127.0.0.1:17759 (udp)
TestAgent_purgeCheckState - 2019/12/30 18:53:34.020265 [INFO] consul: Handled member-join event for server "Node 0855e1a9-151a-d342-d6fd-a7f2f77cc49c.dc1" in area "wan"
TestAgent_purgeCheckState - 2019/12/30 18:53:34.021433 [INFO] agent: Started DNS server 127.0.0.1:17753 (tcp)
TestAgent_purgeCheckState - 2019/12/30 18:53:34.021539 [INFO] agent: Started DNS server 127.0.0.1:17753 (udp)
TestAgent_purgeCheckState - 2019/12/30 18:53:34.024006 [INFO] agent: Started HTTP server on 127.0.0.1:17754 (tcp)
TestAgent_purgeCheckState - 2019/12/30 18:53:34.024112 [INFO] agent: started state syncer
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.026816 [INFO] agent: Started HTTP server on 127.0.0.1:17760 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.026914 [INFO] agent: started state syncer
2019/12/30 18:53:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17758 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17764 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b2d8c26f-419f-e45d-dcc0-b8affbed31a9 Address:127.0.0.1:17752}]
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17752 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.169830 [INFO] serf: EventMemberJoin: Node b2d8c26f-419f-e45d-dcc0-b8affbed31a9.dc1 127.0.0.1
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.173618 [INFO] serf: EventMemberJoin: Node b2d8c26f-419f-e45d-dcc0-b8affbed31a9 127.0.0.1
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.174465 [INFO] consul: Handled member-join event for server "Node b2d8c26f-419f-e45d-dcc0-b8affbed31a9.dc1" in area "wan"
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.174771 [INFO] consul: Adding LAN server Node b2d8c26f-419f-e45d-dcc0-b8affbed31a9 (Addr: tcp/127.0.0.1:17752) (DC: dc1)
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.176904 [INFO] agent: Started DNS server 127.0.0.1:17747 (udp)
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.176989 [INFO] agent: Started DNS server 127.0.0.1:17747 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.179357 [INFO] agent: Started HTTP server on 127.0.0.1:17748 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.179536 [INFO] agent: started state syncer
2019/12/30 18:53:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17752 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17746 [Leader] entering Leader state
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:34.519797 [INFO] consul: cluster leadership acquired
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:34.520224 [INFO] consul: New leader elected: Node 3124a1e3-6ee4-8f1f-2401-ace7e2238ff9
2019/12/30 18:53:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17758 [Leader] entering Leader state
2019/12/30 18:53:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17764 [Leader] entering Leader state
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.679355 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:34.679906 [INFO] consul: New leader elected: Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9
TestAgent_purgeCheckState - 2019/12/30 18:53:34.680179 [INFO] consul: cluster leadership acquired
TestAgent_purgeCheckState - 2019/12/30 18:53:34.680556 [INFO] consul: New leader elected: Node 0855e1a9-151a-d342-d6fd-a7f2f77cc49c
2019/12/30 18:53:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:34 [INFO]  raft: Node at 127.0.0.1:17752 [Leader] entering Leader state
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.749698 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:34.750244 [INFO] consul: New leader elected: Node b2d8c26f-419f-e45d-dcc0-b8affbed31a9
TestAgent_purgeCheckState - 2019/12/30 18:53:34.869324 [INFO] agent: Requesting shutdown
TestAgent_purgeCheckState - 2019/12/30 18:53:34.869496 [INFO] consul: shutting down server
TestAgent_purgeCheckState - 2019/12/30 18:53:34.869558 [WARN] serf: Shutdown without a Leave
TestAgent_purgeCheckState - 2019/12/30 18:53:34.869688 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_purgeCheckState - 2019/12/30 18:53:34.986918 [WARN] serf: Shutdown without a Leave
TestAgent_purgeCheckState - 2019/12/30 18:53:35.105934 [INFO] manager: shutting down
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:35.106881 [INFO] agent: Synced node info
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:35.106996 [DEBUG] agent: Node info in sync
TestAgent_purgeCheckState - 2019/12/30 18:53:35.107611 [INFO] agent: consul server down
TestAgent_purgeCheckState - 2019/12/30 18:53:35.107694 [INFO] agent: shutdown complete
TestAgent_purgeCheckState - 2019/12/30 18:53:35.107754 [INFO] agent: Stopping DNS server 127.0.0.1:17753 (tcp)
TestAgent_purgeCheckState - 2019/12/30 18:53:35.107913 [INFO] agent: Stopping DNS server 127.0.0.1:17753 (udp)
TestAgent_purgeCheckState - 2019/12/30 18:53:35.108104 [INFO] agent: Stopping HTTP server 127.0.0.1:17754 (tcp)
TestAgent_purgeCheckState - 2019/12/30 18:53:35.108329 [INFO] agent: Waiting for endpoints to shut down
TestAgent_purgeCheckState - 2019/12/30 18:53:35.108405 [INFO] agent: Endpoints down
--- PASS: TestAgent_purgeCheckState (2.20s)
=== CONT  TestKVSEndpoint_ListKeys
TestAgent_purgeCheckState - 2019/12/30 18:53:35.109131 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.115018 [INFO] agent: Synced node info
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:35.115322 [DEBUG] agent: Node info in sync
=== RUN   TestACL_Legacy_Disabled_Response/0
=== RUN   TestACL_Legacy_Disabled_Response/1
=== RUN   TestACL_Legacy_Disabled_Response/2
=== RUN   TestACL_Legacy_Disabled_Response/3
=== RUN   TestACL_Legacy_Disabled_Response/4
=== RUN   TestACL_Legacy_Disabled_Response/5
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.130935 [INFO] agent: Requesting shutdown
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.131030 [INFO] consul: shutting down server
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.131078 [WARN] serf: Shutdown without a Leave
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.142693 [DEBUG] agent: Node info in sync
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.142844 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:35.182788 [WARN] agent: Node name "Node 68a20e94-68c7-1553-436f-0e28030fb92a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:35.183300 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:35.186432 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.207904 [INFO] agent: Requesting shutdown
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.208019 [INFO] consul: shutting down server
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.208073 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.208509 [INFO] agent: Synced node info
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.330990 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.332488 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.671649 [INFO] manager: shutting down
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.673843 [INFO] agent: consul server down
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.673919 [INFO] agent: shutdown complete
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.673984 [INFO] agent: Stopping DNS server 127.0.0.1:17747 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.674145 [INFO] agent: Stopping DNS server 127.0.0.1:17747 (udp)
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.674334 [INFO] agent: Stopping HTTP server 127.0.0.1:17748 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.674624 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.674711 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_GET_Raw (2.77s)
=== CONT  TestKVSEndpoint_CAS
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.685952 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.686325 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.686540 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.686610 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.686660 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestKVSEndpoint_GET_Raw - 2019/12/30 18:53:35.686712 [ERR] consul: failed to transfer leadership in 3 attempts
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:35.686818 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_CAS - 2019/12/30 18:53:35.762398 [WARN] agent: Node name "Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_CAS - 2019/12/30 18:53:35.762890 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_CAS - 2019/12/30 18:53:35.765220 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.124630 [INFO] agent: consul server down
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.124709 [INFO] agent: shutdown complete
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.124770 [INFO] agent: Stopping DNS server 127.0.0.1:17741 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.124911 [INFO] agent: Stopping DNS server 127.0.0.1:17741 (udp)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.125072 [INFO] agent: Stopping HTTP server 127.0.0.1:17742 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.124658 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.125278 [INFO] agent: Waiting for endpoints to shut down
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.125340 [INFO] agent: Endpoints down
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.125346 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
--- PASS: TestACL_Legacy_Disabled_Response (3.22s)
    --- PASS: TestACL_Legacy_Disabled_Response/0 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/1 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/2 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/3 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/4 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/5 (0.00s)
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.125460 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
=== CONT  TestKVSEndpoint_DELETE_CAS
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.125510 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestACL_Legacy_Disabled_Response - 2019/12/30 18:53:36.125555 [ERR] consul: failed to transfer leadership in 3 attempts
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:36.397295 [WARN] agent: Node name "Node d06e8d68-25f1-2a50-34bb-8e7d35cd4d14" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:36.397766 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:36.400060 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:36.857243 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:36.857862 [DEBUG] consul: Skipping self join check for "Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9" since the cluster is too small
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:36.858040 [INFO] consul: member 'Node 1e9f3ef7-c8a0-7c79-9831-1397fa7743c9' joined, marking health alive
2019/12/30 18:53:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:68a20e94-68c7-1553-436f-0e28030fb92a Address:127.0.0.1:17770}]
2019/12/30 18:53:36 [INFO]  raft: Node at 127.0.0.1:17770 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:36.993967 [INFO] serf: EventMemberJoin: Node 68a20e94-68c7-1553-436f-0e28030fb92a.dc1 127.0.0.1
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:36.997530 [INFO] serf: EventMemberJoin: Node 68a20e94-68c7-1553-436f-0e28030fb92a 127.0.0.1
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:36.998318 [INFO] consul: Handled member-join event for server "Node 68a20e94-68c7-1553-436f-0e28030fb92a.dc1" in area "wan"
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:36.998664 [INFO] consul: Adding LAN server Node 68a20e94-68c7-1553-436f-0e28030fb92a (Addr: tcp/127.0.0.1:17770) (DC: dc1)
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:36.998911 [INFO] agent: Started DNS server 127.0.0.1:17765 (udp)
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:37.000802 [INFO] agent: Started DNS server 127.0.0.1:17765 (tcp)
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:37.003370 [INFO] agent: Started HTTP server on 127.0.0.1:17766 (tcp)
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:37.003535 [INFO] agent: started state syncer
2019/12/30 18:53:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:37 [INFO]  raft: Node at 127.0.0.1:17770 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:57d4ed8c-ec10-f0ec-e152-33ccc4080f89 Address:127.0.0.1:17776}]
2019/12/30 18:53:37 [INFO]  raft: Node at 127.0.0.1:17776 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.277791 [INFO] serf: EventMemberJoin: Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89.dc1 127.0.0.1
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.286016 [INFO] serf: EventMemberJoin: Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89 127.0.0.1
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.287626 [INFO] consul: Adding LAN server Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89 (Addr: tcp/127.0.0.1:17776) (DC: dc1)
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.287945 [INFO] consul: Handled member-join event for server "Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89.dc1" in area "wan"
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.289621 [INFO] agent: Started DNS server 127.0.0.1:17771 (udp)
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.291589 [INFO] agent: Started DNS server 127.0.0.1:17771 (tcp)
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.294258 [INFO] agent: Started HTTP server on 127.0.0.1:17772 (tcp)
TestKVSEndpoint_CAS - 2019/12/30 18:53:37.296929 [INFO] agent: started state syncer
2019/12/30 18:53:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:37 [INFO]  raft: Node at 127.0.0.1:17776 [Candidate] entering Candidate state in term 2
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:37.907778 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d06e8d68-25f1-2a50-34bb-8e7d35cd4d14 Address:127.0.0.1:17782}]
2019/12/30 18:53:38 [INFO]  raft: Node at 127.0.0.1:17782 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.035317 [INFO] serf: EventMemberJoin: Node d06e8d68-25f1-2a50-34bb-8e7d35cd4d14.dc1 127.0.0.1
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.038571 [INFO] serf: EventMemberJoin: Node d06e8d68-25f1-2a50-34bb-8e7d35cd4d14 127.0.0.1
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.039448 [INFO] consul: Handled member-join event for server "Node d06e8d68-25f1-2a50-34bb-8e7d35cd4d14.dc1" in area "wan"
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.039747 [INFO] consul: Adding LAN server Node d06e8d68-25f1-2a50-34bb-8e7d35cd4d14 (Addr: tcp/127.0.0.1:17782) (DC: dc1)
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.039953 [INFO] agent: Started DNS server 127.0.0.1:17777 (udp)
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.040306 [INFO] agent: Started DNS server 127.0.0.1:17777 (tcp)
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.042985 [INFO] agent: Started HTTP server on 127.0.0.1:17778 (tcp)
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.043087 [INFO] agent: started state syncer
2019/12/30 18:53:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:38 [INFO]  raft: Node at 127.0.0.1:17782 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:38 [INFO]  raft: Node at 127.0.0.1:17770 [Leader] entering Leader state
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:38.325879 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:38.326364 [INFO] consul: New leader elected: Node 68a20e94-68c7-1553-436f-0e28030fb92a
2019/12/30 18:53:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:38 [INFO]  raft: Node at 127.0.0.1:17776 [Leader] entering Leader state
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.435740 [INFO] agent: Requesting shutdown
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.435840 [INFO] consul: shutting down server
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.435909 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_CAS - 2019/12/30 18:53:38.437254 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_CAS - 2019/12/30 18:53:38.437753 [INFO] consul: New leader elected: Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.522895 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.631082 [INFO] manager: shutting down
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.631933 [INFO] agent: consul server down
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.632000 [INFO] agent: shutdown complete
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.632062 [INFO] agent: Stopping DNS server 127.0.0.1:17759 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.632257 [INFO] agent: Stopping DNS server 127.0.0.1:17759 (udp)
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.632419 [INFO] agent: Stopping HTTP server 127.0.0.1:17760 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.632626 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_AcquireRelease - 2019/12/30 18:53:38.632703 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_AcquireRelease (5.70s)
=== CONT  TestKVSEndpoint_Recurse
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:38.738663 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_Recurse - 2019/12/30 18:53:38.774272 [WARN] agent: Node name "Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_Recurse - 2019/12/30 18:53:38.774963 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_Recurse - 2019/12/30 18:53:38.780170 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:38 [INFO]  raft: Node at 127.0.0.1:17782 [Leader] entering Leader state
TestKVSEndpoint_CAS - 2019/12/30 18:53:38.834827 [INFO] agent: Synced node info
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.834986 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:38.835342 [INFO] consul: New leader elected: Node d06e8d68-25f1-2a50-34bb-8e7d35cd4d14
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:39.377873 [INFO] agent: Synced node info
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:39.711969 [DEBUG] agent: Node info in sync
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:39.712111 [DEBUG] agent: Node info in sync
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.232812 [DEBUG] agent: Node info in sync
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.232944 [DEBUG] agent: Node info in sync
2019/12/30 18:53:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0589e6e4-9315-6ce6-c2a3-a407cbac2618 Address:127.0.0.1:17788}]
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.260784 [INFO] serf: EventMemberJoin: Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618.dc1 127.0.0.1
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.264130 [INFO] serf: EventMemberJoin: Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618 127.0.0.1
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.265744 [INFO] agent: Started DNS server 127.0.0.1:17783 (udp)
2019/12/30 18:53:41 [INFO]  raft: Node at 127.0.0.1:17788 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.269294 [INFO] consul: Handled member-join event for server "Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618.dc1" in area "wan"
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.269500 [INFO] consul: Adding LAN server Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618 (Addr: tcp/127.0.0.1:17788) (DC: dc1)
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.270100 [INFO] agent: Started DNS server 127.0.0.1:17783 (tcp)
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.272697 [INFO] agent: Started HTTP server on 127.0.0.1:17784 (tcp)
TestKVSEndpoint_Recurse - 2019/12/30 18:53:41.272855 [INFO] agent: started state syncer
2019/12/30 18:53:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:41 [INFO]  raft: Node at 127.0.0.1:17788 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:53:41.382768 [DEBUG] consul: Skipping self join check for "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786" since the cluster is too small
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.396661 [INFO] agent: Requesting shutdown
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.396775 [INFO] consul: shutting down server
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.396829 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:41.398901 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.648911 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.856405 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.857300 [INFO] manager: shutting down
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.858295 [INFO] agent: consul server down
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.858362 [INFO] agent: shutdown complete
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.858416 [INFO] agent: Stopping DNS server 127.0.0.1:17771 (tcp)
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.858576 [INFO] agent: Stopping DNS server 127.0.0.1:17771 (udp)
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.858748 [INFO] agent: Stopping HTTP server 127.0.0.1:17772 (tcp)
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.858955 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.859027 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_CAS (6.18s)
=== CONT  TestKVSEndpoint_PUT_GET_DELETE
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.863567 [WARN] consul: error getting server health from "Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:17776: connect: connection reset by peer
TestKVSEndpoint_CAS - 2019/12/30 18:53:41.895357 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.032395 [INFO] agent: Requesting shutdown
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.032492 [INFO] consul: shutting down server
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.032539 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:42.045655 [WARN] agent: Node name "Node fb62e4db-4418-772c-78cd-bd26cadf48d2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:42.046253 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:42.051991 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:42 [INFO]  raft: Node at 127.0.0.1:17788 [Leader] entering Leader state
TestKVSEndpoint_Recurse - 2019/12/30 18:53:42.123303 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_Recurse - 2019/12/30 18:53:42.123718 [INFO] consul: New leader elected: Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.193126 [DEBUG] agent: Node info in sync
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.193273 [DEBUG] agent: Node info in sync
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.206159 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.314492 [INFO] manager: shutting down
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.315350 [INFO] agent: consul server down
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.315416 [INFO] agent: shutdown complete
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.315471 [INFO] agent: Stopping DNS server 127.0.0.1:17777 (tcp)
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.315622 [INFO] agent: Stopping DNS server 127.0.0.1:17777 (udp)
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.315797 [INFO] agent: Stopping HTTP server 127.0.0.1:17778 (tcp)
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.316049 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.316119 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_DELETE_CAS (6.19s)
=== CONT  TestAgentKeyring_ACL
TestKVSEndpoint_DELETE_CAS - 2019/12/30 18:53:42.343063 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentKeyring_ACL - 2019/12/30 18:53:42.427668 [WARN] agent: Node name "Node f3328b00-0658-1d9b-052a-a97b62f00a57" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentKeyring_ACL - 2019/12/30 18:53:42.428383 [DEBUG] tlsutil: Update with version 1
TestAgentKeyring_ACL - 2019/12/30 18:53:42.440740 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.517645 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.518113 [DEBUG] consul: Skipping self join check for "Node 68a20e94-68c7-1553-436f-0e28030fb92a" since the cluster is too small
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.518283 [INFO] consul: member 'Node 68a20e94-68c7-1553-436f-0e28030fb92a' joined, marking health alive
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.518568 [INFO] agent: Requesting shutdown
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.518638 [INFO] consul: shutting down server
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.518682 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_Recurse - 2019/12/30 18:53:42.520467 [INFO] agent: Synced node info
TestKVSEndpoint_Recurse - 2019/12/30 18:53:42.520603 [DEBUG] agent: Node info in sync
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.681112 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.786712 [INFO] manager: shutting down
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.789171 [INFO] agent: consul server down
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.789242 [INFO] agent: shutdown complete
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.789298 [INFO] agent: Stopping DNS server 127.0.0.1:17765 (tcp)
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.789499 [INFO] agent: Stopping DNS server 127.0.0.1:17765 (udp)
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.789664 [INFO] agent: Stopping HTTP server 127.0.0.1:17766 (tcp)
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.789951 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_ListKeys - 2019/12/30 18:53:42.790029 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_ListKeys (7.68s)
=== CONT  TestAgent_InitKeyring
=== CONT  TestAgent_InmemKeyrings
--- PASS: TestAgent_InitKeyring (0.00s)
=== RUN   TestAgent_InmemKeyrings/no_keys
TestKVSEndpoint_CAS - 2019/12/30 18:53:42.856480 [WARN] consul: error getting server health from "Node 57d4ed8c-ec10-f0ec-e152-33ccc4080f89": context deadline exceeded
TestKVSEndpoint_CAS - 2019/12/30 18:53:42.856679 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:42.896870 [WARN] agent: Node name "Node 3c12a107-1ef9-f22c-2616-5cad5eb5a72b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:42.897297 [DEBUG] tlsutil: Update with version 1
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:42.911475 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:53:43.198464 [DEBUG] consul: Skipping self join check for "Node 90e88a15-5862-4de0-2f1f-c638261bac76" since the cluster is too small
2019/12/30 18:53:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fb62e4db-4418-772c-78cd-bd26cadf48d2 Address:127.0.0.1:17794}]
2019/12/30 18:53:43 [INFO]  raft: Node at 127.0.0.1:17794 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.204233 [INFO] serf: EventMemberJoin: Node fb62e4db-4418-772c-78cd-bd26cadf48d2.dc1 127.0.0.1
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.233256 [INFO] serf: EventMemberJoin: Node fb62e4db-4418-772c-78cd-bd26cadf48d2 127.0.0.1
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.236941 [INFO] consul: Adding LAN server Node fb62e4db-4418-772c-78cd-bd26cadf48d2 (Addr: tcp/127.0.0.1:17794) (DC: dc1)
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.237478 [INFO] agent: Started DNS server 127.0.0.1:17789 (udp)
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.238326 [INFO] consul: Handled member-join event for server "Node fb62e4db-4418-772c-78cd-bd26cadf48d2.dc1" in area "wan"
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.243414 [INFO] agent: Started DNS server 127.0.0.1:17789 (tcp)
2019/12/30 18:53:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:43 [INFO]  raft: Node at 127.0.0.1:17794 [Candidate] entering Candidate state in term 2
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.264810 [INFO] agent: Started HTTP server on 127.0.0.1:17790 (tcp)
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.264962 [INFO] agent: started state syncer
2019/12/30 18:53:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f3328b00-0658-1d9b-052a-a97b62f00a57 Address:127.0.0.1:17800}]
TestAgentKeyring_ACL - 2019/12/30 18:53:43.596195 [INFO] serf: EventMemberJoin: Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1 127.0.0.1
TestAgentKeyring_ACL - 2019/12/30 18:53:43.599642 [INFO] serf: EventMemberJoin: Node f3328b00-0658-1d9b-052a-a97b62f00a57 127.0.0.1
TestAgentKeyring_ACL - 2019/12/30 18:53:43.601131 [INFO] agent: Started DNS server 127.0.0.1:17795 (udp)
2019/12/30 18:53:43 [INFO]  raft: Node at 127.0.0.1:17800 [Follower] entering Follower state (Leader: "")
TestAgentKeyring_ACL - 2019/12/30 18:53:43.603677 [INFO] consul: Adding LAN server Node f3328b00-0658-1d9b-052a-a97b62f00a57 (Addr: tcp/127.0.0.1:17800) (DC: dc1)
TestAgentKeyring_ACL - 2019/12/30 18:53:43.605180 [INFO] agent: Started DNS server 127.0.0.1:17795 (tcp)
TestAgentKeyring_ACL - 2019/12/30 18:53:43.607650 [INFO] agent: Started HTTP server on 127.0.0.1:17796 (tcp)
TestAgentKeyring_ACL - 2019/12/30 18:53:43.607767 [INFO] agent: started state syncer
TestAgentKeyring_ACL - 2019/12/30 18:53:43.609125 [INFO] consul: Handled member-join event for server "Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1" in area "wan"
2019/12/30 18:53:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:43 [INFO]  raft: Node at 127.0.0.1:17800 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:43 [INFO]  raft: Node at 127.0.0.1:17794 [Leader] entering Leader state
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.883606 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:43.884032 [INFO] consul: New leader elected: Node fb62e4db-4418-772c-78cd-bd26cadf48d2
2019/12/30 18:53:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3c12a107-1ef9-f22c-2616-5cad5eb5a72b Address:127.0.0.1:17806}]
2019/12/30 18:53:44 [INFO]  raft: Node at 127.0.0.1:17806 [Follower] entering Follower state (Leader: "")
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.043547 [INFO] serf: EventMemberJoin: Node 3c12a107-1ef9-f22c-2616-5cad5eb5a72b.dc1 127.0.0.1
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.057242 [INFO] serf: EventMemberJoin: Node 3c12a107-1ef9-f22c-2616-5cad5eb5a72b 127.0.0.1
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.059206 [INFO] consul: Adding LAN server Node 3c12a107-1ef9-f22c-2616-5cad5eb5a72b (Addr: tcp/127.0.0.1:17806) (DC: dc1)
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.060019 [INFO] consul: Handled member-join event for server "Node 3c12a107-1ef9-f22c-2616-5cad5eb5a72b.dc1" in area "wan"
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.061274 [INFO] agent: Started DNS server 127.0.0.1:17801 (tcp)
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.061363 [INFO] agent: Started DNS server 127.0.0.1:17801 (udp)
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.063627 [INFO] agent: Started HTTP server on 127.0.0.1:17802 (tcp)
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.063721 [INFO] agent: started state syncer
2019/12/30 18:53:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:44 [INFO]  raft: Node at 127.0.0.1:17806 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:44 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:44 [INFO]  raft: Node at 127.0.0.1:17800 [Leader] entering Leader state
TestAgentKeyring_ACL - 2019/12/30 18:53:44.296884 [INFO] consul: cluster leadership acquired
TestAgentKeyring_ACL - 2019/12/30 18:53:44.297339 [INFO] consul: New leader elected: Node f3328b00-0658-1d9b-052a-a97b62f00a57
TestAgentKeyring_ACL - 2019/12/30 18:53:44.304667 [ERR] agent: failed to sync remote state: ACL not found
TestAgentKeyring_ACL - 2019/12/30 18:53:44.356295 [INFO] acl: initializing acls
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:44.562388 [INFO] agent: Synced node info
TestAgentKeyring_ACL - 2019/12/30 18:53:44.710880 [INFO] consul: Created ACL 'global-management' policy
TestAgentKeyring_ACL - 2019/12/30 18:53:44.710980 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentKeyring_ACL - 2019/12/30 18:53:44.715701 [INFO] acl: initializing acls
TestAgentKeyring_ACL - 2019/12/30 18:53:44.715845 [WARN] consul: Configuring a non-UUID master token is deprecated
2019/12/30 18:53:44 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:44 [INFO]  raft: Node at 127.0.0.1:17806 [Leader] entering Leader state
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.809351 [INFO] consul: cluster leadership acquired
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.809836 [INFO] consul: New leader elected: Node 3c12a107-1ef9-f22c-2616-5cad5eb5a72b
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.932296 [INFO] agent: Requesting shutdown
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.932420 [INFO] consul: shutting down server
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:44.932466 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.064895 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.066046 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.066207 [INFO] agent: Requesting shutdown
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.066717 [INFO] consul: shutting down server
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.066908 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.066267 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.071989 [DEBUG] consul: Skipping self join check for "Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618" since the cluster is too small
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.072280 [INFO] consul: member 'Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618' joined, marking health alive
TestAgentKeyring_ACL - 2019/12/30 18:53:45.078676 [INFO] consul: Bootstrapped ACL master token from configuration
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.082794 [WARN] consul: error getting server health from "Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618": rpc error making call: EOF
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.182254 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.184926 [INFO] manager: shutting down
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.216131 [DEBUG] agent: Node info in sync
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.306242 [INFO] manager: shutting down
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.306494 [INFO] agent: consul server down
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.309008 [INFO] agent: shutdown complete
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.309069 [INFO] agent: Stopping DNS server 127.0.0.1:17801 (tcp)
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.309231 [INFO] agent: Stopping DNS server 127.0.0.1:17801 (udp)
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.310710 [INFO] agent: consul server down
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.310779 [INFO] agent: shutdown complete
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.310832 [INFO] agent: Stopping DNS server 127.0.0.1:17783 (tcp)
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.310952 [INFO] agent: Stopping DNS server 127.0.0.1:17783 (udp)
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.311083 [INFO] agent: Stopping HTTP server 127.0.0.1:17784 (tcp)
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.311258 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_Recurse - 2019/12/30 18:53:45.311321 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_Recurse (6.68s)
=== CONT  TestAgent_LoadKeyrings
=== RUN   TestAgent_LoadKeyrings/no_keys
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.309469 [INFO] agent: Stopping HTTP server 127.0.0.1:17802 (tcp)
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.315104 [INFO] agent: Waiting for endpoints to shut down
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.315317 [INFO] agent: Endpoints down
=== RUN   TestAgent_InmemKeyrings/server_with_keys
jones - 2019/12/30 18:53:45.306953 [DEBUG] consul: Skipping self join check for "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6" since the cluster is too small
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.306569 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.306636 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_InmemKeyrings/no_keys - 2019/12/30 18:53:45.331144 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgentKeyring_ACL - 2019/12/30 18:53:45.398476 [INFO] consul: Bootstrapped ACL master token from configuration
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:45.450257 [WARN] agent: Node name "Node e3297c66-daad-e3f4-6e69-7c4fd0820a9c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:45.450826 [DEBUG] tlsutil: Update with version 1
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:45.455542 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:45.474003 [WARN] agent: Node name "Node 3d7a1966-eda9-756b-312f-985abf683878" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:45.474938 [DEBUG] tlsutil: Update with version 1
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:45.478382 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.646893 [INFO] consul: Created ACL anonymous token from configuration
TestAgentKeyring_ACL - 2019/12/30 18:53:45.647170 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentKeyring_ACL - 2019/12/30 18:53:45.647987 [INFO] serf: EventMemberUpdate: Node f3328b00-0658-1d9b-052a-a97b62f00a57
TestAgentKeyring_ACL - 2019/12/30 18:53:45.648754 [INFO] serf: EventMemberUpdate: Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.661722 [INFO] serf: Received list-keys query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.663436 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.664040 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.670912 [INFO] serf: Received list-keys query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.672707 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57
TestAgentKeyring_ACL - 2019/12/30 18:53:45.677068 [INFO] serf: Received install-key query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.691231 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.692822 [INFO] serf: Received install-key query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.694915 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57
TestAgentKeyring_ACL - 2019/12/30 18:53:45.696473 [INFO] serf: Received use-key query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.698458 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.700067 [INFO] serf: Received use-key query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.702911 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57
TestAgentKeyring_ACL - 2019/12/30 18:53:45.705522 [INFO] serf: Received remove-key query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.708892 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.710535 [INFO] serf: Received remove-key query
TestAgentKeyring_ACL - 2019/12/30 18:53:45.712532 [DEBUG] serf: messageQueryResponseType: Node f3328b00-0658-1d9b-052a-a97b62f00a57
TestAgentKeyring_ACL - 2019/12/30 18:53:45.713360 [INFO] agent: Requesting shutdown
TestAgentKeyring_ACL - 2019/12/30 18:53:45.713440 [INFO] consul: shutting down server
TestAgentKeyring_ACL - 2019/12/30 18:53:45.713483 [WARN] serf: Shutdown without a Leave
TestAgentKeyring_ACL - 2019/12/30 18:53:45.823641 [WARN] serf: Shutdown without a Leave
TestAgentKeyring_ACL - 2019/12/30 18:53:45.823978 [INFO] consul: Created ACL anonymous token from configuration
TestAgentKeyring_ACL - 2019/12/30 18:53:45.824894 [INFO] serf: EventMemberUpdate: Node f3328b00-0658-1d9b-052a-a97b62f00a57
TestAgentKeyring_ACL - 2019/12/30 18:53:45.825563 [INFO] serf: EventMemberUpdate: Node f3328b00-0658-1d9b-052a-a97b62f00a57.dc1
TestAgentKeyring_ACL - 2019/12/30 18:53:45.906229 [INFO] manager: shutting down
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983158 [INFO] agent: consul server down
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983233 [INFO] agent: shutdown complete
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983293 [INFO] agent: Stopping DNS server 127.0.0.1:17795 (tcp)
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983424 [INFO] agent: Stopping DNS server 127.0.0.1:17795 (udp)
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983429 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983581 [INFO] agent: Stopping HTTP server 127.0.0.1:17796 (tcp)
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983799 [INFO] agent: Waiting for endpoints to shut down
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983819 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgentKeyring_ACL - 2019/12/30 18:53:45.983959 [INFO] agent: Endpoints down
--- PASS: TestAgentKeyring_ACL (3.67s)
=== CONT  TestIntentionsSpecificDelete_good
TestAgentKeyring_ACL - 2019/12/30 18:53:45.989195 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:46.030807 [DEBUG] agent: Node info in sync
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:46.030929 [DEBUG] agent: Node info in sync
TestKVSEndpoint_Recurse - 2019/12/30 18:53:46.064950 [WARN] consul: error getting server health from "Node 0589e6e4-9315-6ce6-c2a3-a407cbac2618": context deadline exceeded
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:46.076954 [WARN] agent: Node name "Node a1c1e5ff-5f9b-9200-bdb5-eec48bc23f4f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:46.077696 [DEBUG] tlsutil: Update with version 1
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:46.081547 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e3297c66-daad-e3f4-6e69-7c4fd0820a9c Address:127.0.0.1:17818}]
2019/12/30 18:53:46 [INFO]  raft: Node at 127.0.0.1:17818 [Follower] entering Follower state (Leader: "")
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.579531 [INFO] serf: EventMemberJoin: Node e3297c66-daad-e3f4-6e69-7c4fd0820a9c.dc1 127.0.0.1
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.587868 [INFO] serf: EventMemberJoin: Node e3297c66-daad-e3f4-6e69-7c4fd0820a9c 127.0.0.1
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.590069 [INFO] consul: Adding LAN server Node e3297c66-daad-e3f4-6e69-7c4fd0820a9c (Addr: tcp/127.0.0.1:17818) (DC: dc1)
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.591083 [INFO] consul: Handled member-join event for server "Node e3297c66-daad-e3f4-6e69-7c4fd0820a9c.dc1" in area "wan"
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.593809 [INFO] agent: Started DNS server 127.0.0.1:17813 (tcp)
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.594339 [INFO] agent: Started DNS server 127.0.0.1:17813 (udp)
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.598753 [INFO] agent: Started HTTP server on 127.0.0.1:17814 (tcp)
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:46.598884 [INFO] agent: started state syncer
2019/12/30 18:53:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:46 [INFO]  raft: Node at 127.0.0.1:17818 [Candidate] entering Candidate state in term 2
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:46.710491 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:46.914637 [DEBUG] consul: Skipping self join check for "Node fb62e4db-4418-772c-78cd-bd26cadf48d2" since the cluster is too small
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:46.921408 [INFO] consul: member 'Node fb62e4db-4418-772c-78cd-bd26cadf48d2' joined, marking health alive
2019/12/30 18:53:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3d7a1966-eda9-756b-312f-985abf683878 Address:127.0.0.1:17812}]
2019/12/30 18:53:46 [INFO]  raft: Node at 127.0.0.1:17812 [Follower] entering Follower state (Leader: "")
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.930183 [INFO] serf: EventMemberJoin: Node 3d7a1966-eda9-756b-312f-985abf683878.dc1 127.0.0.1
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.935831 [INFO] serf: EventMemberJoin: Node 3d7a1966-eda9-756b-312f-985abf683878 127.0.0.1
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.938405 [INFO] consul: Adding LAN server Node 3d7a1966-eda9-756b-312f-985abf683878 (Addr: tcp/127.0.0.1:17812) (DC: dc1)
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.939248 [INFO] consul: Handled member-join event for server "Node 3d7a1966-eda9-756b-312f-985abf683878.dc1" in area "wan"
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.944589 [INFO] agent: Started DNS server 127.0.0.1:17807 (udp)
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.945072 [INFO] agent: Started DNS server 127.0.0.1:17807 (tcp)
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.947514 [INFO] agent: Started HTTP server on 127.0.0.1:17808 (tcp)
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:46.947608 [INFO] agent: started state syncer
2019/12/30 18:53:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:46 [INFO]  raft: Node at 127.0.0.1:17812 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:53:47.327920 [DEBUG] consul: Skipping self join check for "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef" since the cluster is too small
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:47.498160 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:47 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:47 [INFO]  raft: Node at 127.0.0.1:17818 [Leader] entering Leader state
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.504566 [INFO] consul: cluster leadership acquired
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.504962 [INFO] consul: New leader elected: Node e3297c66-daad-e3f4-6e69-7c4fd0820a9c
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.515251 [INFO] agent: Requesting shutdown
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.515372 [INFO] consul: shutting down server
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.515428 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.515864 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:53:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a1c1e5ff-5f9b-9200-bdb5-eec48bc23f4f Address:127.0.0.1:17824}]
2019/12/30 18:53:47 [INFO]  raft: Node at 127.0.0.1:17824 [Follower] entering Follower state (Leader: "")
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.613127 [INFO] serf: EventMemberJoin: Node a1c1e5ff-5f9b-9200-bdb5-eec48bc23f4f.dc1 127.0.0.1
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.617688 [INFO] serf: EventMemberJoin: Node a1c1e5ff-5f9b-9200-bdb5-eec48bc23f4f 127.0.0.1
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.619292 [INFO] consul: Adding LAN server Node a1c1e5ff-5f9b-9200-bdb5-eec48bc23f4f (Addr: tcp/127.0.0.1:17824) (DC: dc1)
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.620071 [INFO] consul: Handled member-join event for server "Node a1c1e5ff-5f9b-9200-bdb5-eec48bc23f4f.dc1" in area "wan"
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.621863 [INFO] agent: Started DNS server 127.0.0.1:17819 (tcp)
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.622517 [INFO] agent: Started DNS server 127.0.0.1:17819 (udp)
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.625260 [INFO] agent: Started HTTP server on 127.0.0.1:17820 (tcp)
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:47.625390 [INFO] agent: started state syncer
2019/12/30 18:53:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:47 [INFO]  raft: Node at 127.0.0.1:17824 [Candidate] entering Candidate state in term 2
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.706136 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.789850 [INFO] manager: shutting down
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.790066 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.791502 [INFO] agent: consul server down
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.791580 [INFO] agent: shutdown complete
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.791657 [INFO] agent: Stopping DNS server 127.0.0.1:17813 (tcp)
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.791837 [INFO] agent: Stopping DNS server 127.0.0.1:17813 (udp)
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.792035 [INFO] agent: Stopping HTTP server 127.0.0.1:17814 (tcp)
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.792279 [INFO] agent: Waiting for endpoints to shut down
TestAgent_InmemKeyrings/server_with_keys - 2019/12/30 18:53:47.792364 [INFO] agent: Endpoints down
=== RUN   TestAgent_InmemKeyrings/client_with_keys
2019/12/30 18:53:47 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:47 [INFO]  raft: Node at 127.0.0.1:17812 [Leader] entering Leader state
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:47.796141 [INFO] consul: cluster leadership acquired
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:47.796550 [INFO] consul: New leader elected: Node 3d7a1966-eda9-756b-312f-985abf683878
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.873638 [WARN] agent: Node name "Node 5ceb9ee0-1bf5-5033-04d9-640f15981d89" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.874036 [DEBUG] tlsutil: Update with version 1
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.878912 [INFO] serf: EventMemberJoin: Node 5ceb9ee0-1bf5-5033-04d9-640f15981d89 127.0.0.1
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.880501 [INFO] agent: Started DNS server 127.0.0.1:17825 (udp)
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.881938 [INFO] agent: Started DNS server 127.0.0.1:17825 (tcp)
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.884182 [INFO] agent: Started HTTP server on 127.0.0.1:17826 (tcp)
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.884272 [INFO] agent: started state syncer
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.886570 [WARN] manager: No servers available
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.886713 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.891842 [INFO] agent: Requesting shutdown
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.892045 [INFO] consul: shutting down client
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.892390 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:47.892574 [INFO] manager: shutting down
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:47.948270 [INFO] agent: Requesting shutdown
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:47.948377 [INFO] consul: shutting down server
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:47.948427 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:48.023000 [INFO] agent: consul client down
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:48.023092 [INFO] agent: shutdown complete
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:48.023156 [INFO] agent: Stopping DNS server 127.0.0.1:17825 (tcp)
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:48.023374 [INFO] agent: Stopping DNS server 127.0.0.1:17825 (udp)
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:48.023678 [INFO] agent: Stopping HTTP server 127.0.0.1:17826 (tcp)
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:48.023986 [INFO] agent: Waiting for endpoints to shut down
TestAgent_InmemKeyrings/client_with_keys - 2019/12/30 18:53:48.024077 [INFO] agent: Endpoints down
=== RUN   TestAgent_InmemKeyrings/ignore_files
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.024561 [WARN] serf: Shutdown without a Leave
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.097928 [INFO] manager: shutting down
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.098606 [INFO] agent: Requesting shutdown
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.098691 [INFO] consul: shutting down server
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.098742 [WARN] serf: Shutdown without a Leave
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.098958 [INFO] agent: consul server down
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.099014 [INFO] agent: shutdown complete
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.099070 [INFO] agent: Stopping DNS server 127.0.0.1:17807 (tcp)
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.099278 [INFO] agent: Stopping DNS server 127.0.0.1:17807 (udp)
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.099555 [INFO] agent: Stopping HTTP server 127.0.0.1:17808 (tcp)
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.099846 [INFO] agent: Waiting for endpoints to shut down
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.099920 [INFO] agent: Endpoints down
=== RUN   TestAgent_LoadKeyrings/server_with_keys
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.116184 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.116212 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_LoadKeyrings/no_keys - 2019/12/30 18:53:48.116324 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
2019/12/30 18:53:48 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:48 [INFO]  raft: Node at 127.0.0.1:17824 [Leader] entering Leader state
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.183044 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:48.189847 [WARN] agent: Node name "Node d107fdd3-c961-727f-e690-c30ec88c2d1c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:48.190344 [DEBUG] tlsutil: Update with version 1
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.192840 [INFO] consul: cluster leadership acquired
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.194536 [INFO] consul: New leader elected: Node a1c1e5ff-5f9b-9200-bdb5-eec48bc23f4f
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:48.209904 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:48.294708 [WARN] agent: Node name "Node 2cb3dc17-b950-16e1-692c-2a20583e1675" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:48.296359 [DEBUG] tlsutil: Update with version 1
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:48.299185 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.381396 [INFO] manager: shutting down
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.382263 [INFO] agent: consul server down
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.382335 [INFO] agent: shutdown complete
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.382403 [INFO] agent: Stopping DNS server 127.0.0.1:17789 (tcp)
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.382592 [INFO] agent: Stopping DNS server 127.0.0.1:17789 (udp)
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.382776 [INFO] agent: Stopping HTTP server 127.0.0.1:17790 (tcp)
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.383033 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_PUT_GET_DELETE - 2019/12/30 18:53:48.383118 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_PUT_GET_DELETE (6.52s)
=== CONT  TestIntentionsSpecificUpdate_good
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:48.459972 [WARN] agent: Node name "Node 2d2153b0-f0d1-f4e6-d94f-ec73e7b43c93" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:48.460569 [DEBUG] tlsutil: Update with version 1
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:48.462948 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.633336 [INFO] agent: Synced node info
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.633465 [DEBUG] agent: Node info in sync
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.845181 [INFO] agent: Requesting shutdown
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.845277 [INFO] consul: shutting down server
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.845325 [WARN] serf: Shutdown without a Leave
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:48.956761 [WARN] serf: Shutdown without a Leave
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.056374 [INFO] manager: shutting down
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.131703 [INFO] agent: consul server down
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.131817 [INFO] agent: shutdown complete
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.131915 [INFO] agent: Stopping DNS server 127.0.0.1:17819 (tcp)
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.132140 [INFO] agent: Stopping DNS server 127.0.0.1:17819 (udp)
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.132347 [INFO] agent: Stopping HTTP server 127.0.0.1:17820 (tcp)
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.132588 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.132672 [INFO] agent: Endpoints down
--- PASS: TestIntentionsSpecificDelete_good (3.15s)
=== CONT  TestIntentionsSpecificGet_invalidId
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.139670 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.140045 [ERR] consul: failed to establish leadership: raft is already shutdown
TestIntentionsSpecificDelete_good - 2019/12/30 18:53:49.140290 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:49.193069 [WARN] agent: Node name "Node 87feb3e1-adf9-3179-56ee-f38cbf391f91" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:49.193491 [DEBUG] tlsutil: Update with version 1
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:49.195640 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d107fdd3-c961-727f-e690-c30ec88c2d1c Address:127.0.0.1:17836}]
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:17836 [Follower] entering Follower state (Leader: "")
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.653978 [INFO] serf: EventMemberJoin: Node d107fdd3-c961-727f-e690-c30ec88c2d1c.dc1 127.0.0.1
2019/12/30 18:53:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2cb3dc17-b950-16e1-692c-2a20583e1675 Address:127.0.0.1:17842}]
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.661821 [INFO] serf: EventMemberJoin: Node 2cb3dc17-b950-16e1-692c-2a20583e1675.dc1 127.0.0.1
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:17842 [Follower] entering Follower state (Leader: "")
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.665994 [INFO] serf: EventMemberJoin: Node 2cb3dc17-b950-16e1-692c-2a20583e1675 127.0.0.1
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.667076 [INFO] consul: Adding LAN server Node 2cb3dc17-b950-16e1-692c-2a20583e1675 (Addr: tcp/127.0.0.1:17842) (DC: dc1)
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.667304 [INFO] agent: Started DNS server 127.0.0.1:17837 (udp)
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.667337 [INFO] consul: Handled member-join event for server "Node 2cb3dc17-b950-16e1-692c-2a20583e1675.dc1" in area "wan"
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.667696 [INFO] agent: Started DNS server 127.0.0.1:17837 (tcp)
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.670343 [INFO] agent: Started HTTP server on 127.0.0.1:17838 (tcp)
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:49.670442 [INFO] agent: started state syncer
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.672455 [INFO] serf: EventMemberJoin: Node d107fdd3-c961-727f-e690-c30ec88c2d1c 127.0.0.1
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.673392 [INFO] consul: Adding LAN server Node d107fdd3-c961-727f-e690-c30ec88c2d1c (Addr: tcp/127.0.0.1:17836) (DC: dc1)
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.674330 [INFO] consul: Handled member-join event for server "Node d107fdd3-c961-727f-e690-c30ec88c2d1c.dc1" in area "wan"
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.674357 [INFO] agent: Started DNS server 127.0.0.1:17831 (tcp)
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.674587 [INFO] agent: Started DNS server 127.0.0.1:17831 (udp)
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.677282 [INFO] agent: Started HTTP server on 127.0.0.1:17832 (tcp)
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:49.677507 [INFO] agent: started state syncer
2019/12/30 18:53:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:17842 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:17836 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2d2153b0-f0d1-f4e6-d94f-ec73e7b43c93 Address:127.0.0.1:17848}]
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:17848 [Follower] entering Follower state (Leader: "")
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.768197 [INFO] serf: EventMemberJoin: Node 2d2153b0-f0d1-f4e6-d94f-ec73e7b43c93.dc1 127.0.0.1
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.772117 [INFO] serf: EventMemberJoin: Node 2d2153b0-f0d1-f4e6-d94f-ec73e7b43c93 127.0.0.1
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.773418 [INFO] consul: Adding LAN server Node 2d2153b0-f0d1-f4e6-d94f-ec73e7b43c93 (Addr: tcp/127.0.0.1:17848) (DC: dc1)
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.773732 [INFO] agent: Started DNS server 127.0.0.1:17843 (udp)
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.773895 [INFO] agent: Started DNS server 127.0.0.1:17843 (tcp)
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.774113 [INFO] consul: Handled member-join event for server "Node 2d2153b0-f0d1-f4e6-d94f-ec73e7b43c93.dc1" in area "wan"
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.776644 [INFO] agent: Started HTTP server on 127.0.0.1:17844 (tcp)
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:49.776746 [INFO] agent: started state syncer
2019/12/30 18:53:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:17848 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:53:49.875844 [DEBUG] consul: Skipping self join check for "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d" since the cluster is too small
2019/12/30 18:53:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:50 [INFO]  raft: Node at 127.0.0.1:17848 [Leader] entering Leader state
2019/12/30 18:53:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:50 [INFO]  raft: Node at 127.0.0.1:17842 [Leader] entering Leader state
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:50.483020 [INFO] consul: cluster leadership acquired
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:50.483457 [INFO] consul: New leader elected: Node 2d2153b0-f0d1-f4e6-d94f-ec73e7b43c93
2019/12/30 18:53:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:50 [INFO]  raft: Node at 127.0.0.1:17836 [Leader] entering Leader state
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:50.486308 [INFO] consul: cluster leadership acquired
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:50.486734 [INFO] consul: New leader elected: Node d107fdd3-c961-727f-e690-c30ec88c2d1c
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:50.486948 [INFO] consul: cluster leadership acquired
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:50.487247 [INFO] consul: New leader elected: Node 2cb3dc17-b950-16e1-692c-2a20583e1675
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:50.489348 [INFO] agent: Requesting shutdown
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:50.489501 [INFO] consul: shutting down server
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:50.489554 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:50.489923 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:50.509973 [INFO] agent: Requesting shutdown
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:50.510070 [INFO] consul: shutting down server
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:50.510125 [WARN] serf: Shutdown without a Leave
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:50.510528 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:50.649069 [WARN] serf: Shutdown without a Leave
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.006162 [WARN] serf: Shutdown without a Leave
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.098045 [INFO] manager: shutting down
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.098243 [ERR] consul: failed to wait for barrier: raft is already shutdown
2019/12/30 18:53:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:87feb3e1-adf9-3179-56ee-f38cbf391f91 Address:127.0.0.1:17854}]
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.098561 [INFO] manager: shutting down
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.098666 [ERR] consul: failed to wait for barrier: raft is already shutdown
2019/12/30 18:53:51 [INFO]  raft: Node at 127.0.0.1:17854 [Follower] entering Follower state (Leader: "")
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.099673 [INFO] agent: consul server down
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.099758 [INFO] agent: shutdown complete
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.100090 [INFO] agent: Stopping DNS server 127.0.0.1:17831 (tcp)
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.100304 [INFO] agent: Stopping DNS server 127.0.0.1:17831 (udp)
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.100511 [INFO] agent: Stopping HTTP server 127.0.0.1:17832 (tcp)
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.100738 [INFO] agent: Waiting for endpoints to shut down
TestAgent_InmemKeyrings/ignore_files - 2019/12/30 18:53:51.100810 [INFO] agent: Endpoints down
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.102400 [INFO] serf: EventMemberJoin: Node 87feb3e1-adf9-3179-56ee-f38cbf391f91.dc1 127.0.0.1
--- PASS: TestAgent_InmemKeyrings (8.31s)
    --- PASS: TestAgent_InmemKeyrings/no_keys (2.52s)
    --- PASS: TestAgent_InmemKeyrings/server_with_keys (2.47s)
    --- PASS: TestAgent_InmemKeyrings/client_with_keys (0.23s)
    --- PASS: TestAgent_InmemKeyrings/ignore_files (3.08s)
=== CONT  TestIntentionsSpecificGet_good
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.106427 [INFO] serf: EventMemberJoin: Node 87feb3e1-adf9-3179-56ee-f38cbf391f91 127.0.0.1
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.107320 [INFO] agent: consul server down
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.107392 [INFO] agent: shutdown complete
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.107451 [INFO] agent: Stopping DNS server 127.0.0.1:17837 (tcp)
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.107608 [INFO] agent: Stopping DNS server 127.0.0.1:17837 (udp)
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.107794 [INFO] agent: Stopping HTTP server 127.0.0.1:17838 (tcp)
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.108001 [INFO] agent: Started DNS server 127.0.0.1:17849 (udp)
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.108022 [INFO] agent: Waiting for endpoints to shut down
TestAgent_LoadKeyrings/server_with_keys - 2019/12/30 18:53:51.108087 [INFO] agent: Endpoints down
=== RUN   TestAgent_LoadKeyrings/client_with_keys
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.108449 [INFO] consul: Handled member-join event for server "Node 87feb3e1-adf9-3179-56ee-f38cbf391f91.dc1" in area "wan"
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.108577 [INFO] consul: Adding LAN server Node 87feb3e1-adf9-3179-56ee-f38cbf391f91 (Addr: tcp/127.0.0.1:17854) (DC: dc1)
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.108940 [INFO] agent: Started DNS server 127.0.0.1:17849 (tcp)
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.122262 [INFO] agent: Started HTTP server on 127.0.0.1:17850 (tcp)
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.122365 [INFO] agent: started state syncer
2019/12/30 18:53:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:51 [INFO]  raft: Node at 127.0.0.1:17854 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsSpecificGet_good - 2019/12/30 18:53:51.232961 [WARN] agent: Node name "Node 368f5971-2e2f-099a-e0a1-864c660d76a3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsSpecificGet_good - 2019/12/30 18:53:51.233441 [DEBUG] tlsutil: Update with version 1
TestIntentionsSpecificGet_good - 2019/12/30 18:53:51.235810 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.247101 [WARN] agent: Node name "Node ec3d9d5d-fc84-4b55-7235-ed8e0b7b17cf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.248167 [DEBUG] tlsutil: Update with version 1
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.256649 [INFO] serf: EventMemberJoin: Node ec3d9d5d-fc84-4b55-7235-ed8e0b7b17cf 127.0.0.1
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.258789 [INFO] agent: Started DNS server 127.0.0.1:17855 (tcp)
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.260203 [INFO] agent: Started DNS server 127.0.0.1:17855 (udp)
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.267372 [INFO] agent: Started HTTP server on 127.0.0.1:17856 (tcp)
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.267945 [INFO] agent: started state syncer
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.268252 [WARN] manager: No servers available
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.268859 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.276011 [INFO] agent: Requesting shutdown
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.277427 [INFO] consul: shutting down client
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.277718 [WARN] serf: Shutdown without a Leave
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.277892 [INFO] manager: shutting down
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.439711 [INFO] agent: consul client down
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.439808 [INFO] agent: shutdown complete
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.439874 [INFO] agent: Stopping DNS server 127.0.0.1:17855 (tcp)
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.440082 [INFO] agent: Stopping DNS server 127.0.0.1:17855 (udp)
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.440297 [INFO] agent: Stopping HTTP server 127.0.0.1:17856 (tcp)
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.440551 [INFO] agent: Waiting for endpoints to shut down
TestAgent_LoadKeyrings/client_with_keys - 2019/12/30 18:53:51.440631 [INFO] agent: Endpoints down
--- PASS: TestAgent_LoadKeyrings (6.13s)
    --- PASS: TestAgent_LoadKeyrings/no_keys (2.79s)
    --- PASS: TestAgent_LoadKeyrings/server_with_keys (3.01s)
    --- PASS: TestAgent_LoadKeyrings/client_with_keys (0.33s)
=== CONT  TestIntentionsCreate_noBody
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:51.445178 [INFO] agent: Synced node info
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:51.445334 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsCreate_noBody - 2019/12/30 18:53:51.589516 [WARN] agent: Node name "Node 6e689049-fa2e-95d8-0af8-cd934d810dac" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsCreate_noBody - 2019/12/30 18:53:51.590409 [DEBUG] tlsutil: Update with version 1
TestIntentionsCreate_noBody - 2019/12/30 18:53:51.592865 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:53:51.745233 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:53:51.745327 [DEBUG] agent: Node info in sync
2019/12/30 18:53:51 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:51 [INFO]  raft: Node at 127.0.0.1:17854 [Leader] entering Leader state
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.875586 [INFO] consul: cluster leadership acquired
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:51.876028 [INFO] consul: New leader elected: Node 87feb3e1-adf9-3179-56ee-f38cbf391f91
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:51.983708 [INFO] agent: Requesting shutdown
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:51.983815 [INFO] consul: shutting down server
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:51.983865 [WARN] serf: Shutdown without a Leave
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.035378 [INFO] agent: Requesting shutdown
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.035502 [INFO] consul: shutting down server
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.035554 [WARN] serf: Shutdown without a Leave
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.084286 [WARN] serf: Shutdown without a Leave
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.164647 [WARN] serf: Shutdown without a Leave
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.164733 [INFO] manager: shutting down
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.164762 [ERR] consul: failed to establish leadership: raft is already shutdown
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.164933 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.165081 [INFO] agent: consul server down
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.165156 [INFO] agent: shutdown complete
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.165221 [INFO] agent: Stopping DNS server 127.0.0.1:17843 (tcp)
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.165391 [INFO] agent: Stopping DNS server 127.0.0.1:17843 (udp)
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.165573 [INFO] agent: Stopping HTTP server 127.0.0.1:17844 (tcp)
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.165895 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsSpecificUpdate_good - 2019/12/30 18:53:52.165977 [INFO] agent: Endpoints down
--- PASS: TestIntentionsSpecificUpdate_good (3.78s)
=== CONT  TestIntentionsCreate_good
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.248039 [INFO] manager: shutting down
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.250103 [INFO] agent: consul server down
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.250188 [INFO] agent: shutdown complete
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.250283 [INFO] agent: Stopping DNS server 127.0.0.1:17849 (tcp)
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.250465 [INFO] agent: Stopping DNS server 127.0.0.1:17849 (udp)
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.250669 [INFO] agent: Stopping HTTP server 127.0.0.1:17850 (tcp)
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.250897 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.250977 [INFO] agent: Endpoints down
--- PASS: TestIntentionsSpecificGet_invalidId (3.12s)
=== CONT  TestIntentionsCheck_noDestination
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.263195 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.263346 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestIntentionsSpecificGet_invalidId - 2019/12/30 18:53:52.263397 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsCreate_good - 2019/12/30 18:53:52.291068 [WARN] agent: Node name "Node 44630ed3-4cd5-05fa-115d-1d3764d8077d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsCreate_good - 2019/12/30 18:53:52.291972 [DEBUG] tlsutil: Update with version 1
TestIntentionsCreate_good - 2019/12/30 18:53:52.294919 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:368f5971-2e2f-099a-e0a1-864c660d76a3 Address:127.0.0.1:17866}]
2019/12/30 18:53:52 [INFO]  raft: Node at 127.0.0.1:17866 [Follower] entering Follower state (Leader: "")
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.362172 [INFO] serf: EventMemberJoin: Node 368f5971-2e2f-099a-e0a1-864c660d76a3.dc1 127.0.0.1
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.366149 [INFO] serf: EventMemberJoin: Node 368f5971-2e2f-099a-e0a1-864c660d76a3 127.0.0.1
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.369117 [INFO] consul: Adding LAN server Node 368f5971-2e2f-099a-e0a1-864c660d76a3 (Addr: tcp/127.0.0.1:17866) (DC: dc1)
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.369725 [INFO] consul: Handled member-join event for server "Node 368f5971-2e2f-099a-e0a1-864c660d76a3.dc1" in area "wan"
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.370524 [INFO] agent: Started DNS server 127.0.0.1:17861 (udp)
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.370701 [INFO] agent: Started DNS server 127.0.0.1:17861 (tcp)
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.373080 [INFO] agent: Started HTTP server on 127.0.0.1:17862 (tcp)
TestIntentionsSpecificGet_good - 2019/12/30 18:53:52.373327 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsCheck_noDestination - 2019/12/30 18:53:52.407035 [WARN] agent: Node name "Node c314b11c-470a-7746-5c28-085ae74f148d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsCheck_noDestination - 2019/12/30 18:53:52.407477 [DEBUG] tlsutil: Update with version 1
TestIntentionsCheck_noDestination - 2019/12/30 18:53:52.413964 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:52 [INFO]  raft: Node at 127.0.0.1:17866 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6e689049-fa2e-95d8-0af8-cd934d810dac Address:127.0.0.1:17872}]
2019/12/30 18:53:52 [INFO]  raft: Node at 127.0.0.1:17872 [Follower] entering Follower state (Leader: "")
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.585145 [INFO] serf: EventMemberJoin: Node 6e689049-fa2e-95d8-0af8-cd934d810dac.dc1 127.0.0.1
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.601760 [INFO] serf: EventMemberJoin: Node 6e689049-fa2e-95d8-0af8-cd934d810dac 127.0.0.1
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.607984 [INFO] consul: Handled member-join event for server "Node 6e689049-fa2e-95d8-0af8-cd934d810dac.dc1" in area "wan"
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.624498 [INFO] agent: Started DNS server 127.0.0.1:17867 (udp)
2019/12/30 18:53:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:52 [INFO]  raft: Node at 127.0.0.1:17872 [Candidate] entering Candidate state in term 2
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.631078 [INFO] consul: Adding LAN server Node 6e689049-fa2e-95d8-0af8-cd934d810dac (Addr: tcp/127.0.0.1:17872) (DC: dc1)
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.632900 [INFO] agent: Started DNS server 127.0.0.1:17867 (tcp)
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.649044 [INFO] agent: Started HTTP server on 127.0.0.1:17868 (tcp)
TestIntentionsCreate_noBody - 2019/12/30 18:53:52.649301 [INFO] agent: started state syncer
jones - 2019/12/30 18:53:54.502172 [DEBUG] consul: Skipping self join check for "Node 5122c9d8-8979-c841-956f-094a90e62880" since the cluster is too small
2019/12/30 18:53:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:54 [INFO]  raft: Node at 127.0.0.1:17866 [Leader] entering Leader state
TestIntentionsSpecificGet_good - 2019/12/30 18:53:54.504786 [INFO] consul: cluster leadership acquired
TestIntentionsSpecificGet_good - 2019/12/30 18:53:54.505186 [INFO] consul: New leader elected: Node 368f5971-2e2f-099a-e0a1-864c660d76a3
2019/12/30 18:53:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:54 [INFO]  raft: Node at 127.0.0.1:17872 [Leader] entering Leader state
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.666527 [INFO] consul: cluster leadership acquired
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.667065 [INFO] consul: New leader elected: Node 6e689049-fa2e-95d8-0af8-cd934d810dac
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.696975 [INFO] agent: Requesting shutdown
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.697089 [INFO] consul: shutting down server
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.697150 [WARN] serf: Shutdown without a Leave
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.697309 [ERR] agent: failed to sync remote state: No cluster leader
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.776003 [WARN] serf: Shutdown without a Leave
2019/12/30 18:53:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:44630ed3-4cd5-05fa-115d-1d3764d8077d Address:127.0.0.1:17878}]
2019/12/30 18:53:54 [INFO]  raft: Node at 127.0.0.1:17878 [Follower] entering Follower state (Leader: "")
TestIntentionsCreate_good - 2019/12/30 18:53:54.894080 [INFO] serf: EventMemberJoin: Node 44630ed3-4cd5-05fa-115d-1d3764d8077d.dc1 127.0.0.1
TestIntentionsCreate_noBody - 2019/12/30 18:53:54.895355 [INFO] manager: shutting down
TestIntentionsCreate_good - 2019/12/30 18:53:54.898850 [INFO] serf: EventMemberJoin: Node 44630ed3-4cd5-05fa-115d-1d3764d8077d 127.0.0.1
TestIntentionsCreate_good - 2019/12/30 18:53:54.899980 [INFO] consul: Adding LAN server Node 44630ed3-4cd5-05fa-115d-1d3764d8077d (Addr: tcp/127.0.0.1:17878) (DC: dc1)
TestIntentionsCreate_good - 2019/12/30 18:53:54.900639 [INFO] consul: Handled member-join event for server "Node 44630ed3-4cd5-05fa-115d-1d3764d8077d.dc1" in area "wan"
TestIntentionsCreate_good - 2019/12/30 18:53:54.903241 [INFO] agent: Started DNS server 127.0.0.1:17873 (udp)
TestIntentionsCreate_good - 2019/12/30 18:53:54.903334 [INFO] agent: Started DNS server 127.0.0.1:17873 (tcp)
TestIntentionsCreate_good - 2019/12/30 18:53:54.905907 [INFO] agent: Started HTTP server on 127.0.0.1:17874 (tcp)
TestIntentionsCreate_good - 2019/12/30 18:53:54.906003 [INFO] agent: started state syncer
2019/12/30 18:53:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:54 [INFO]  raft: Node at 127.0.0.1:17878 [Candidate] entering Candidate state in term 2
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.024789 [INFO] agent: Synced node info
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.025922 [INFO] agent: Requesting shutdown
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.026021 [INFO] consul: shutting down server
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.026069 [WARN] serf: Shutdown without a Leave
2019/12/30 18:53:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c314b11c-470a-7746-5c28-085ae74f148d Address:127.0.0.1:17884}]
2019/12/30 18:53:55 [INFO]  raft: Node at 127.0.0.1:17884 [Follower] entering Follower state (Leader: "")
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.031252 [INFO] serf: EventMemberJoin: Node c314b11c-470a-7746-5c28-085ae74f148d.dc1 127.0.0.1
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.035727 [INFO] serf: EventMemberJoin: Node c314b11c-470a-7746-5c28-085ae74f148d 127.0.0.1
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.037172 [INFO] consul: Adding LAN server Node c314b11c-470a-7746-5c28-085ae74f148d (Addr: tcp/127.0.0.1:17884) (DC: dc1)
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.037680 [INFO] consul: Handled member-join event for server "Node c314b11c-470a-7746-5c28-085ae74f148d.dc1" in area "wan"
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.038916 [INFO] agent: Started DNS server 127.0.0.1:17879 (tcp)
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.039441 [INFO] agent: Started DNS server 127.0.0.1:17879 (udp)
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.041835 [INFO] agent: Started HTTP server on 127.0.0.1:17880 (tcp)
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.041929 [INFO] agent: started state syncer
2019/12/30 18:53:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:55 [INFO]  raft: Node at 127.0.0.1:17884 [Candidate] entering Candidate state in term 2
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.120073 [ERR] agent: failed to sync remote state: No cluster leader
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.131790 [INFO] agent: consul server down
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.131876 [INFO] agent: shutdown complete
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.131945 [INFO] agent: Stopping DNS server 127.0.0.1:17867 (tcp)
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.132114 [INFO] agent: Stopping DNS server 127.0.0.1:17867 (udp)
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.132332 [INFO] agent: Stopping HTTP server 127.0.0.1:17868 (tcp)
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.132563 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.132638 [INFO] agent: Endpoints down
--- PASS: TestIntentionsCreate_noBody (3.69s)
=== CONT  TestIntentionsCheck_noSource
TestIntentionsCreate_noBody - 2019/12/30 18:53:55.133276 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.135229 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:53:55.201034 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:53:55.201148 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/30 18:53:55.201190 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsCheck_noSource - 2019/12/30 18:53:55.207983 [WARN] agent: Node name "Node db366e1f-adb7-213b-9285-8dba414d4cc9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsCheck_noSource - 2019/12/30 18:53:55.208655 [DEBUG] tlsutil: Update with version 1
TestIntentionsCheck_noSource - 2019/12/30 18:53:55.211218 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.223755 [INFO] manager: shutting down
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.225235 [INFO] agent: consul server down
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.225316 [INFO] agent: shutdown complete
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.225379 [INFO] agent: Stopping DNS server 127.0.0.1:17861 (tcp)
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.225563 [INFO] agent: Stopping DNS server 127.0.0.1:17861 (udp)
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.225756 [INFO] agent: Stopping HTTP server 127.0.0.1:17862 (tcp)
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.226043 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.226130 [INFO] agent: Endpoints down
--- PASS: TestIntentionsSpecificGet_good (4.12s)
=== CONT  TestIntentionsCheck_basic
TestIntentionsSpecificGet_good - 2019/12/30 18:53:55.226960 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsCheck_basic - 2019/12/30 18:53:55.288710 [WARN] agent: Node name "Node 4fc77551-9ba9-7892-14dd-2f9e5d41d1a3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsCheck_basic - 2019/12/30 18:53:55.289108 [DEBUG] tlsutil: Update with version 1
TestIntentionsCheck_basic - 2019/12/30 18:53:55.291314 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:55 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:55 [INFO]  raft: Node at 127.0.0.1:17878 [Leader] entering Leader state
TestIntentionsCreate_good - 2019/12/30 18:53:55.761958 [INFO] consul: cluster leadership acquired
TestIntentionsCreate_good - 2019/12/30 18:53:55.762464 [INFO] consul: New leader elected: Node 44630ed3-4cd5-05fa-115d-1d3764d8077d
2019/12/30 18:53:55 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:55 [INFO]  raft: Node at 127.0.0.1:17884 [Leader] entering Leader state
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.906705 [INFO] consul: cluster leadership acquired
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.907149 [INFO] consul: New leader elected: Node c314b11c-470a-7746-5c28-085ae74f148d
jones - 2019/12/30 18:53:55.908617 [DEBUG] consul: Skipping self join check for "Node a8b3e297-b53a-bcd0-efda-5addcd938805" since the cluster is too small
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.974679 [INFO] agent: Requesting shutdown
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.974789 [INFO] consul: shutting down server
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.974847 [WARN] serf: Shutdown without a Leave
TestIntentionsCheck_noDestination - 2019/12/30 18:53:55.975301 [ERR] agent: failed to sync remote state: No cluster leader
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.102856 [WARN] serf: Shutdown without a Leave
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.176035 [INFO] manager: shutting down
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.224593 [ERR] agent: failed to sync remote state: No cluster leader
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.323078 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.323370 [INFO] agent: consul server down
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.323433 [INFO] agent: shutdown complete
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.323500 [INFO] agent: Stopping DNS server 127.0.0.1:17879 (tcp)
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.323677 [INFO] agent: Stopping DNS server 127.0.0.1:17879 (udp)
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.323870 [INFO] agent: Stopping HTTP server 127.0.0.1:17880 (tcp)
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.324748 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsCheck_noDestination - 2019/12/30 18:53:56.324962 [INFO] agent: Endpoints down
--- PASS: TestIntentionsCheck_noDestination (4.07s)
=== CONT  TestIntentionsMatch_noName
TestIntentionsCreate_good - 2019/12/30 18:53:56.327273 [INFO] agent: Requesting shutdown
TestIntentionsCreate_good - 2019/12/30 18:53:56.327369 [INFO] consul: shutting down server
TestIntentionsCreate_good - 2019/12/30 18:53:56.327285 [INFO] agent: Synced node info
TestIntentionsCreate_good - 2019/12/30 18:53:56.327418 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsMatch_noName - 2019/12/30 18:53:56.392804 [WARN] agent: Node name "Node 21128e91-9ef2-456f-2916-3f1c778d3e70" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsMatch_noName - 2019/12/30 18:53:56.393397 [DEBUG] tlsutil: Update with version 1
TestIntentionsMatch_noName - 2019/12/30 18:53:56.396831 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestIntentionsCreate_good - 2019/12/30 18:53:56.518250 [WARN] serf: Shutdown without a Leave
TestIntentionsCreate_good - 2019/12/30 18:53:56.683501 [INFO] manager: shutting down
2019/12/30 18:53:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:db366e1f-adb7-213b-9285-8dba414d4cc9 Address:127.0.0.1:17890}]
2019/12/30 18:53:56 [INFO]  raft: Node at 127.0.0.1:17890 [Follower] entering Follower state (Leader: "")
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.694977 [INFO] serf: EventMemberJoin: Node db366e1f-adb7-213b-9285-8dba414d4cc9.dc1 127.0.0.1
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.702794 [INFO] serf: EventMemberJoin: Node db366e1f-adb7-213b-9285-8dba414d4cc9 127.0.0.1
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.704074 [INFO] consul: Adding LAN server Node db366e1f-adb7-213b-9285-8dba414d4cc9 (Addr: tcp/127.0.0.1:17890) (DC: dc1)
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.704808 [INFO] consul: Handled member-join event for server "Node db366e1f-adb7-213b-9285-8dba414d4cc9.dc1" in area "wan"
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.706154 [INFO] agent: Started DNS server 127.0.0.1:17885 (tcp)
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.706656 [INFO] agent: Started DNS server 127.0.0.1:17885 (udp)
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.709236 [INFO] agent: Started HTTP server on 127.0.0.1:17886 (tcp)
TestIntentionsCheck_noSource - 2019/12/30 18:53:56.709324 [INFO] agent: started state syncer
2019/12/30 18:53:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:56 [INFO]  raft: Node at 127.0.0.1:17890 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4fc77551-9ba9-7892-14dd-2f9e5d41d1a3 Address:127.0.0.1:17896}]
2019/12/30 18:53:56 [INFO]  raft: Node at 127.0.0.1:17896 [Follower] entering Follower state (Leader: "")
TestIntentionsCheck_basic - 2019/12/30 18:53:56.797261 [INFO] serf: EventMemberJoin: Node 4fc77551-9ba9-7892-14dd-2f9e5d41d1a3.dc1 127.0.0.1
TestIntentionsCheck_basic - 2019/12/30 18:53:56.800564 [INFO] serf: EventMemberJoin: Node 4fc77551-9ba9-7892-14dd-2f9e5d41d1a3 127.0.0.1
TestIntentionsCheck_basic - 2019/12/30 18:53:56.801158 [INFO] consul: Handled member-join event for server "Node 4fc77551-9ba9-7892-14dd-2f9e5d41d1a3.dc1" in area "wan"
TestIntentionsCheck_basic - 2019/12/30 18:53:56.802075 [INFO] agent: Started DNS server 127.0.0.1:17891 (udp)
TestIntentionsCheck_basic - 2019/12/30 18:53:56.802270 [INFO] agent: Started DNS server 127.0.0.1:17891 (tcp)
TestIntentionsCheck_basic - 2019/12/30 18:53:56.802974 [INFO] consul: Adding LAN server Node 4fc77551-9ba9-7892-14dd-2f9e5d41d1a3 (Addr: tcp/127.0.0.1:17896) (DC: dc1)
TestIntentionsCheck_basic - 2019/12/30 18:53:56.805082 [INFO] agent: Started HTTP server on 127.0.0.1:17892 (tcp)
TestIntentionsCheck_basic - 2019/12/30 18:53:56.805346 [INFO] agent: started state syncer
2019/12/30 18:53:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:56 [INFO]  raft: Node at 127.0.0.1:17896 [Candidate] entering Candidate state in term 2
TestIntentionsCreate_good - 2019/12/30 18:53:57.150018 [INFO] agent: consul server down
TestIntentionsCreate_good - 2019/12/30 18:53:57.150097 [INFO] agent: shutdown complete
TestIntentionsCreate_good - 2019/12/30 18:53:57.150158 [INFO] agent: Stopping DNS server 127.0.0.1:17873 (tcp)
TestIntentionsCreate_good - 2019/12/30 18:53:57.150318 [INFO] agent: Stopping DNS server 127.0.0.1:17873 (udp)
TestIntentionsCreate_good - 2019/12/30 18:53:57.150460 [INFO] agent: Stopping HTTP server 127.0.0.1:17874 (tcp)
TestIntentionsCreate_good - 2019/12/30 18:53:57.150642 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsCreate_good - 2019/12/30 18:53:57.150707 [INFO] agent: Endpoints down
--- PASS: TestIntentionsCreate_good (4.98s)
=== CONT  TestIntentionsMatch_byInvalid
TestIntentionsCreate_good - 2019/12/30 18:53:57.151562 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestIntentionsCreate_good - 2019/12/30 18:53:57.151754 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:57.257876 [WARN] agent: Node name "Node 70d6aac1-d8d1-5d5d-4c86-d68e2287a1c4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:57.258418 [DEBUG] tlsutil: Update with version 1
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:57.261333 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:57 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:57 [INFO]  raft: Node at 127.0.0.1:17890 [Leader] entering Leader state
2019/12/30 18:53:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:21128e91-9ef2-456f-2916-3f1c778d3e70 Address:127.0.0.1:17902}]
TestIntentionsCheck_noSource - 2019/12/30 18:53:57.974600 [INFO] consul: cluster leadership acquired
TestIntentionsCheck_noSource - 2019/12/30 18:53:57.974995 [INFO] consul: New leader elected: Node db366e1f-adb7-213b-9285-8dba414d4cc9
2019/12/30 18:53:57 [INFO]  raft: Node at 127.0.0.1:17902 [Follower] entering Follower state (Leader: "")
TestIntentionsMatch_noName - 2019/12/30 18:53:57.978036 [INFO] serf: EventMemberJoin: Node 21128e91-9ef2-456f-2916-3f1c778d3e70.dc1 127.0.0.1
TestIntentionsMatch_noName - 2019/12/30 18:53:57.981270 [INFO] serf: EventMemberJoin: Node 21128e91-9ef2-456f-2916-3f1c778d3e70 127.0.0.1
TestIntentionsMatch_noName - 2019/12/30 18:53:57.982363 [INFO] agent: Started DNS server 127.0.0.1:17897 (udp)
TestIntentionsMatch_noName - 2019/12/30 18:53:57.982577 [INFO] consul: Handled member-join event for server "Node 21128e91-9ef2-456f-2916-3f1c778d3e70.dc1" in area "wan"
TestIntentionsMatch_noName - 2019/12/30 18:53:57.982686 [INFO] consul: Adding LAN server Node 21128e91-9ef2-456f-2916-3f1c778d3e70 (Addr: tcp/127.0.0.1:17902) (DC: dc1)
TestIntentionsMatch_noName - 2019/12/30 18:53:57.994480 [INFO] agent: Started DNS server 127.0.0.1:17897 (tcp)
TestIntentionsMatch_noName - 2019/12/30 18:53:57.996786 [INFO] agent: Started HTTP server on 127.0.0.1:17898 (tcp)
TestIntentionsMatch_noName - 2019/12/30 18:53:57.996887 [INFO] agent: started state syncer
2019/12/30 18:53:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:58 [INFO]  raft: Node at 127.0.0.1:17902 [Candidate] entering Candidate state in term 2
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.047984 [INFO] agent: Requesting shutdown
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.048095 [INFO] consul: shutting down server
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.048149 [WARN] serf: Shutdown without a Leave
2019/12/30 18:53:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:58 [INFO]  raft: Node at 127.0.0.1:17896 [Leader] entering Leader state
TestIntentionsCheck_basic - 2019/12/30 18:53:58.091628 [INFO] consul: cluster leadership acquired
TestIntentionsCheck_basic - 2019/12/30 18:53:58.092029 [INFO] consul: New leader elected: Node 4fc77551-9ba9-7892-14dd-2f9e5d41d1a3
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.223750 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:53:58.224341 [DEBUG] consul: Skipping self join check for "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d" since the cluster is too small
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.314911 [INFO] manager: shutting down
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.390839 [INFO] agent: consul server down
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.390912 [INFO] agent: shutdown complete
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.390968 [INFO] agent: Stopping DNS server 127.0.0.1:17885 (tcp)
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.391114 [INFO] agent: Stopping DNS server 127.0.0.1:17885 (udp)
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.391265 [INFO] agent: Stopping HTTP server 127.0.0.1:17886 (tcp)
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.391452 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.391516 [INFO] agent: Endpoints down
--- PASS: TestIntentionsCheck_noSource (3.26s)
=== CONT  TestIntentionsMatch_noBy
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.395754 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.395897 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestIntentionsCheck_noSource - 2019/12/30 18:53:58.395943 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsMatch_noBy - 2019/12/30 18:53:58.470361 [WARN] agent: Node name "Node 7bbd0d32-6302-34f7-165c-0d7c5b63077d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsMatch_noBy - 2019/12/30 18:53:58.471138 [DEBUG] tlsutil: Update with version 1
TestIntentionsMatch_noBy - 2019/12/30 18:53:58.473620 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestIntentionsCheck_basic - 2019/12/30 18:53:58.483679 [INFO] agent: Synced node info
TestIntentionsCheck_basic - 2019/12/30 18:53:58.483805 [DEBUG] agent: Node info in sync
2019/12/30 18:53:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:70d6aac1-d8d1-5d5d-4c86-d68e2287a1c4 Address:127.0.0.1:17908}]
2019/12/30 18:53:58 [INFO]  raft: Node at 127.0.0.1:17908 [Follower] entering Follower state (Leader: "")
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.613805 [INFO] serf: EventMemberJoin: Node 70d6aac1-d8d1-5d5d-4c86-d68e2287a1c4.dc1 127.0.0.1
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.616857 [INFO] serf: EventMemberJoin: Node 70d6aac1-d8d1-5d5d-4c86-d68e2287a1c4 127.0.0.1
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.617614 [INFO] consul: Handled member-join event for server "Node 70d6aac1-d8d1-5d5d-4c86-d68e2287a1c4.dc1" in area "wan"
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.617886 [INFO] consul: Adding LAN server Node 70d6aac1-d8d1-5d5d-4c86-d68e2287a1c4 (Addr: tcp/127.0.0.1:17908) (DC: dc1)
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.618296 [INFO] agent: Started DNS server 127.0.0.1:17903 (udp)
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.618578 [INFO] agent: Started DNS server 127.0.0.1:17903 (tcp)
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.621230 [INFO] agent: Started HTTP server on 127.0.0.1:17904 (tcp)
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:58.621330 [INFO] agent: started state syncer
2019/12/30 18:53:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:58 [INFO]  raft: Node at 127.0.0.1:17908 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:58 [INFO]  raft: Node at 127.0.0.1:17902 [Leader] entering Leader state
TestIntentionsMatch_noName - 2019/12/30 18:53:58.682199 [INFO] consul: cluster leadership acquired
TestIntentionsMatch_noName - 2019/12/30 18:53:58.682658 [INFO] consul: New leader elected: Node 21128e91-9ef2-456f-2916-3f1c778d3e70
TestIntentionsMatch_noName - 2019/12/30 18:53:58.846686 [INFO] agent: Requesting shutdown
TestIntentionsMatch_noName - 2019/12/30 18:53:58.846799 [INFO] consul: shutting down server
TestIntentionsMatch_noName - 2019/12/30 18:53:58.846848 [WARN] serf: Shutdown without a Leave
TestIntentionsMatch_noName - 2019/12/30 18:53:58.964681 [WARN] serf: Shutdown without a Leave
TestIntentionsMatch_noName - 2019/12/30 18:53:59.035898 [INFO] manager: shutting down
TestIntentionsMatch_noName - 2019/12/30 18:53:59.131719 [INFO] agent: consul server down
TestIntentionsMatch_noName - 2019/12/30 18:53:59.131794 [INFO] agent: shutdown complete
TestIntentionsMatch_noName - 2019/12/30 18:53:59.131852 [INFO] agent: Stopping DNS server 127.0.0.1:17897 (tcp)
TestIntentionsMatch_noName - 2019/12/30 18:53:59.132004 [INFO] agent: Stopping DNS server 127.0.0.1:17897 (udp)
TestIntentionsMatch_noName - 2019/12/30 18:53:59.132159 [INFO] agent: Stopping HTTP server 127.0.0.1:17898 (tcp)
TestIntentionsMatch_noName - 2019/12/30 18:53:59.132398 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsMatch_noName - 2019/12/30 18:53:59.132470 [INFO] agent: Endpoints down
--- PASS: TestIntentionsMatch_noName (2.81s)
=== CONT  TestIntentionsMatch_basic
TestIntentionsMatch_noName - 2019/12/30 18:53:59.142556 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestIntentionsMatch_noName - 2019/12/30 18:53:59.142736 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestIntentionsMatch_noName - 2019/12/30 18:53:59.142788 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsMatch_basic - 2019/12/30 18:53:59.205412 [WARN] agent: Node name "Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsMatch_basic - 2019/12/30 18:53:59.206048 [DEBUG] tlsutil: Update with version 1
TestIntentionsMatch_basic - 2019/12/30 18:53:59.209217 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:59 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:59 [INFO]  raft: Node at 127.0.0.1:17908 [Leader] entering Leader state
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.400953 [INFO] consul: cluster leadership acquired
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.401371 [INFO] consul: New leader elected: Node 70d6aac1-d8d1-5d5d-4c86-d68e2287a1c4
2019/12/30 18:53:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7bbd0d32-6302-34f7-165c-0d7c5b63077d Address:127.0.0.1:17914}]
TestIntentionsCheck_basic - 2019/12/30 18:53:59.483633 [INFO] agent: Requesting shutdown
TestIntentionsCheck_basic - 2019/12/30 18:53:59.483712 [INFO] consul: shutting down server
TestIntentionsCheck_basic - 2019/12/30 18:53:59.483756 [WARN] serf: Shutdown without a Leave
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.485208 [INFO] serf: EventMemberJoin: Node 7bbd0d32-6302-34f7-165c-0d7c5b63077d.dc1 127.0.0.1
2019/12/30 18:53:59 [INFO]  raft: Node at 127.0.0.1:17914 [Follower] entering Follower state (Leader: "")
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.489285 [INFO] serf: EventMemberJoin: Node 7bbd0d32-6302-34f7-165c-0d7c5b63077d 127.0.0.1
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.493846 [INFO] consul: Adding LAN server Node 7bbd0d32-6302-34f7-165c-0d7c5b63077d (Addr: tcp/127.0.0.1:17914) (DC: dc1)
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.495045 [INFO] agent: Started DNS server 127.0.0.1:17909 (tcp)
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.495338 [INFO] consul: Handled member-join event for server "Node 7bbd0d32-6302-34f7-165c-0d7c5b63077d.dc1" in area "wan"
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.495947 [INFO] agent: Started DNS server 127.0.0.1:17909 (udp)
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.498644 [INFO] agent: Started HTTP server on 127.0.0.1:17910 (tcp)
TestIntentionsMatch_noBy - 2019/12/30 18:53:59.498934 [INFO] agent: started state syncer
2019/12/30 18:53:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:59 [INFO]  raft: Node at 127.0.0.1:17914 [Candidate] entering Candidate state in term 2
TestIntentionsCheck_basic - 2019/12/30 18:53:59.573038 [WARN] serf: Shutdown without a Leave
TestIntentionsCheck_basic - 2019/12/30 18:53:59.641271 [INFO] manager: shutting down
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.672968 [INFO] agent: Requesting shutdown
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.673068 [INFO] consul: shutting down server
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.673120 [WARN] serf: Shutdown without a Leave
TestIntentionsCheck_basic - 2019/12/30 18:53:59.733332 [INFO] agent: consul server down
TestIntentionsCheck_basic - 2019/12/30 18:53:59.733403 [INFO] agent: shutdown complete
TestIntentionsCheck_basic - 2019/12/30 18:53:59.733466 [INFO] agent: Stopping DNS server 127.0.0.1:17891 (tcp)
TestIntentionsCheck_basic - 2019/12/30 18:53:59.765310 [INFO] agent: Stopping DNS server 127.0.0.1:17891 (udp)
TestIntentionsCheck_basic - 2019/12/30 18:53:59.765714 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestIntentionsCheck_basic - 2019/12/30 18:53:59.766205 [INFO] agent: Stopping HTTP server 127.0.0.1:17892 (tcp)
TestIntentionsCheck_basic - 2019/12/30 18:53:59.767407 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsCheck_basic - 2019/12/30 18:53:59.767474 [INFO] agent: Endpoints down
--- PASS: TestIntentionsCheck_basic (4.54s)
=== CONT  TestIntentionsList_values
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.825445 [WARN] serf: Shutdown without a Leave
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.903649 [INFO] agent: Synced node info
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.903813 [DEBUG] agent: Node info in sync
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.904743 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.905001 [ERR] consul: failed to establish leadership: raft is already shutdown
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.905107 [INFO] manager: shutting down
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.905492 [INFO] agent: consul server down
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.905572 [INFO] agent: shutdown complete
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.905642 [INFO] agent: Stopping DNS server 127.0.0.1:17903 (tcp)
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.905817 [INFO] agent: Stopping DNS server 127.0.0.1:17903 (udp)
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.906013 [INFO] agent: Stopping HTTP server 127.0.0.1:17904 (tcp)
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.906281 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.906345 [INFO] agent: Endpoints down
--- PASS: TestIntentionsMatch_byInvalid (2.76s)
=== CONT  TestIntentionsList_empty
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.912216 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestIntentionsMatch_byInvalid - 2019/12/30 18:53:59.912334 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsList_values - 2019/12/30 18:54:00.003449 [WARN] agent: Node name "Node d4ff3a98-67a8-9b5b-56f9-71c424c8277a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsList_values - 2019/12/30 18:54:00.003870 [DEBUG] tlsutil: Update with version 1
TestIntentionsList_values - 2019/12/30 18:54:00.006020 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestIntentionsList_empty - 2019/12/30 18:54:00.063182 [WARN] agent: Node name "Node cebb5d05-17cf-7c88-d10d-373540762901" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestIntentionsList_empty - 2019/12/30 18:54:00.064830 [DEBUG] tlsutil: Update with version 1
TestIntentionsList_empty - 2019/12/30 18:54:00.070605 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:00 [INFO]  raft: Node at 127.0.0.1:17914 [Leader] entering Leader state
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.299975 [INFO] consul: cluster leadership acquired
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.300472 [INFO] consul: New leader elected: Node 7bbd0d32-6302-34f7-165c-0d7c5b63077d
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.357709 [INFO] agent: Requesting shutdown
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.357807 [INFO] consul: shutting down server
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.357870 [WARN] serf: Shutdown without a Leave
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.358156 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:54:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e9bbbb3f-ca96-38b9-c199-84ef0763f27f Address:127.0.0.1:17920}]
2019/12/30 18:54:00 [INFO]  raft: Node at 127.0.0.1:17920 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:54:00.399935 [DEBUG] consul: Skipping self join check for "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade" since the cluster is too small
TestIntentionsMatch_basic - 2019/12/30 18:54:00.402498 [INFO] serf: EventMemberJoin: Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f.dc1 127.0.0.1
TestIntentionsMatch_basic - 2019/12/30 18:54:00.406468 [INFO] serf: EventMemberJoin: Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f 127.0.0.1
TestIntentionsMatch_basic - 2019/12/30 18:54:00.407710 [INFO] consul: Adding LAN server Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f (Addr: tcp/127.0.0.1:17920) (DC: dc1)
TestIntentionsMatch_basic - 2019/12/30 18:54:00.408290 [INFO] consul: Handled member-join event for server "Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f.dc1" in area "wan"
TestIntentionsMatch_basic - 2019/12/30 18:54:00.409735 [INFO] agent: Started DNS server 127.0.0.1:17915 (tcp)
TestIntentionsMatch_basic - 2019/12/30 18:54:00.409830 [INFO] agent: Started DNS server 127.0.0.1:17915 (udp)
TestIntentionsMatch_basic - 2019/12/30 18:54:00.412145 [INFO] agent: Started HTTP server on 127.0.0.1:17916 (tcp)
TestIntentionsMatch_basic - 2019/12/30 18:54:00.412250 [INFO] agent: started state syncer
2019/12/30 18:54:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:00 [INFO]  raft: Node at 127.0.0.1:17920 [Candidate] entering Candidate state in term 2
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.498153 [WARN] serf: Shutdown without a Leave
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.664912 [INFO] manager: shutting down
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.742835 [INFO] agent: consul server down
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.742916 [INFO] agent: shutdown complete
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.742974 [INFO] agent: Stopping DNS server 127.0.0.1:17909 (tcp)
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.743103 [INFO] agent: Stopping DNS server 127.0.0.1:17909 (udp)
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.743259 [INFO] agent: Stopping HTTP server 127.0.0.1:17910 (tcp)
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.743443 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.743513 [INFO] agent: Endpoints down
--- PASS: TestIntentionsMatch_noBy (2.35s)
=== CONT  TestParseToken_ProxyTokenResolve
TestIntentionsMatch_noBy - 2019/12/30 18:54:00.744474 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:00.812919 [WARN] agent: Node name "Node 65a5452b-f16b-4be2-4a42-065b096d8b12" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:00.813534 [DEBUG] tlsutil: Update with version 1
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:00.815974 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:00 [INFO]  raft: Node at 127.0.0.1:17920 [Leader] entering Leader state
TestIntentionsMatch_basic - 2019/12/30 18:54:00.992903 [INFO] consul: cluster leadership acquired
TestIntentionsMatch_basic - 2019/12/30 18:54:00.993338 [INFO] consul: New leader elected: Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f
2019/12/30 18:54:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cebb5d05-17cf-7c88-d10d-373540762901 Address:127.0.0.1:17932}]
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17932 [Follower] entering Follower state (Leader: "")
2019/12/30 18:54:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d4ff3a98-67a8-9b5b-56f9-71c424c8277a Address:127.0.0.1:17926}]
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17926 [Follower] entering Follower state (Leader: "")
TestIntentionsList_values - 2019/12/30 18:54:01.072339 [INFO] serf: EventMemberJoin: Node d4ff3a98-67a8-9b5b-56f9-71c424c8277a.dc1 127.0.0.1
TestIntentionsList_empty - 2019/12/30 18:54:01.078180 [INFO] serf: EventMemberJoin: Node cebb5d05-17cf-7c88-d10d-373540762901.dc1 127.0.0.1
TestIntentionsList_values - 2019/12/30 18:54:01.085724 [INFO] serf: EventMemberJoin: Node d4ff3a98-67a8-9b5b-56f9-71c424c8277a 127.0.0.1
TestIntentionsList_values - 2019/12/30 18:54:01.089723 [INFO] consul: Adding LAN server Node d4ff3a98-67a8-9b5b-56f9-71c424c8277a (Addr: tcp/127.0.0.1:17926) (DC: dc1)
TestIntentionsList_values - 2019/12/30 18:54:01.091518 [INFO] consul: Handled member-join event for server "Node d4ff3a98-67a8-9b5b-56f9-71c424c8277a.dc1" in area "wan"
TestIntentionsList_empty - 2019/12/30 18:54:01.096294 [INFO] serf: EventMemberJoin: Node cebb5d05-17cf-7c88-d10d-373540762901 127.0.0.1
TestIntentionsList_values - 2019/12/30 18:54:01.097282 [INFO] agent: Started DNS server 127.0.0.1:17921 (tcp)
TestIntentionsList_values - 2019/12/30 18:54:01.099858 [INFO] agent: Started DNS server 127.0.0.1:17921 (udp)
TestIntentionsList_values - 2019/12/30 18:54:01.102263 [INFO] agent: Started HTTP server on 127.0.0.1:17922 (tcp)
TestIntentionsList_values - 2019/12/30 18:54:01.102380 [INFO] agent: started state syncer
TestIntentionsList_empty - 2019/12/30 18:54:01.103483 [INFO] consul: Adding LAN server Node cebb5d05-17cf-7c88-d10d-373540762901 (Addr: tcp/127.0.0.1:17932) (DC: dc1)
TestIntentionsList_empty - 2019/12/30 18:54:01.103733 [INFO] consul: Handled member-join event for server "Node cebb5d05-17cf-7c88-d10d-373540762901.dc1" in area "wan"
TestIntentionsList_empty - 2019/12/30 18:54:01.105703 [INFO] agent: Started DNS server 127.0.0.1:17927 (tcp)
TestIntentionsList_empty - 2019/12/30 18:54:01.106214 [INFO] agent: Started DNS server 127.0.0.1:17927 (udp)
TestIntentionsList_empty - 2019/12/30 18:54:01.108903 [INFO] agent: Started HTTP server on 127.0.0.1:17928 (tcp)
TestIntentionsList_empty - 2019/12/30 18:54:01.109004 [INFO] agent: started state syncer
2019/12/30 18:54:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17926 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17932 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17926 [Leader] entering Leader state
TestIntentionsList_values - 2019/12/30 18:54:01.757202 [INFO] consul: cluster leadership acquired
2019/12/30 18:54:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17932 [Leader] entering Leader state
TestIntentionsMatch_basic - 2019/12/30 18:54:01.758896 [INFO] agent: Synced node info
TestIntentionsMatch_basic - 2019/12/30 18:54:01.759035 [DEBUG] agent: Node info in sync
TestIntentionsList_values - 2019/12/30 18:54:01.759767 [INFO] consul: New leader elected: Node d4ff3a98-67a8-9b5b-56f9-71c424c8277a
TestIntentionsList_empty - 2019/12/30 18:54:01.760329 [INFO] consul: cluster leadership acquired
TestIntentionsList_empty - 2019/12/30 18:54:01.760816 [INFO] consul: New leader elected: Node cebb5d05-17cf-7c88-d10d-373540762901
TestIntentionsMatch_basic - 2019/12/30 18:54:01.823743 [DEBUG] agent: Node info in sync
2019/12/30 18:54:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:65a5452b-f16b-4be2-4a42-065b096d8b12 Address:127.0.0.1:17938}]
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17938 [Follower] entering Follower state (Leader: "")
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.880771 [INFO] serf: EventMemberJoin: Node 65a5452b-f16b-4be2-4a42-065b096d8b12.dc1 127.0.0.1
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.884316 [INFO] serf: EventMemberJoin: Node 65a5452b-f16b-4be2-4a42-065b096d8b12 127.0.0.1
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.885787 [INFO] agent: Started DNS server 127.0.0.1:17933 (udp)
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.886267 [INFO] consul: Adding LAN server Node 65a5452b-f16b-4be2-4a42-065b096d8b12 (Addr: tcp/127.0.0.1:17938) (DC: dc1)
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.886470 [INFO] consul: Handled member-join event for server "Node 65a5452b-f16b-4be2-4a42-065b096d8b12.dc1" in area "wan"
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.887103 [INFO] agent: Started DNS server 127.0.0.1:17933 (tcp)
TestIntentionsList_empty - 2019/12/30 18:54:01.888267 [INFO] agent: Requesting shutdown
TestIntentionsList_empty - 2019/12/30 18:54:01.888350 [INFO] consul: shutting down server
TestIntentionsList_empty - 2019/12/30 18:54:01.888398 [WARN] serf: Shutdown without a Leave
TestIntentionsList_empty - 2019/12/30 18:54:01.888864 [ERR] agent: failed to sync remote state: No cluster leader
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.895283 [INFO] agent: Started HTTP server on 127.0.0.1:17934 (tcp)
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:01.895395 [INFO] agent: started state syncer
2019/12/30 18:54:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:01 [INFO]  raft: Node at 127.0.0.1:17938 [Candidate] entering Candidate state in term 2
TestIntentionsList_empty - 2019/12/30 18:54:02.145769 [WARN] serf: Shutdown without a Leave
TestIntentionsList_empty - 2019/12/30 18:54:02.557373 [INFO] manager: shutting down
TestIntentionsList_empty - 2019/12/30 18:54:02.707934 [INFO] agent: consul server down
TestIntentionsList_empty - 2019/12/30 18:54:02.708040 [INFO] agent: shutdown complete
TestIntentionsList_empty - 2019/12/30 18:54:02.708104 [INFO] agent: Stopping DNS server 127.0.0.1:17927 (tcp)
TestIntentionsList_empty - 2019/12/30 18:54:02.708283 [INFO] agent: Stopping DNS server 127.0.0.1:17927 (udp)
TestIntentionsList_empty - 2019/12/30 18:54:02.708467 [INFO] agent: Stopping HTTP server 127.0.0.1:17928 (tcp)
TestIntentionsList_empty - 2019/12/30 18:54:02.708678 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsList_empty - 2019/12/30 18:54:02.708751 [INFO] agent: Endpoints down
--- PASS: TestIntentionsList_empty (2.80s)
=== CONT  TestEnableWebUI
TestIntentionsList_empty - 2019/12/30 18:54:02.708850 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestIntentionsList_values - 2019/12/30 18:54:02.714944 [INFO] agent: Synced node info
TestIntentionsList_values - 2019/12/30 18:54:02.715053 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestEnableWebUI - 2019/12/30 18:54:02.787655 [WARN] agent: Node name "Node b074464a-0d9d-d768-556b-8b33a449c886" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEnableWebUI - 2019/12/30 18:54:02.788091 [DEBUG] tlsutil: Update with version 1
TestEnableWebUI - 2019/12/30 18:54:02.790385 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:03 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:03 [INFO]  raft: Node at 127.0.0.1:17938 [Leader] entering Leader state
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:03.177456 [INFO] consul: cluster leadership acquired
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:03.177906 [INFO] consul: New leader elected: Node 65a5452b-f16b-4be2-4a42-065b096d8b12
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:03.316573 [ERR] agent: failed to sync remote state: ACL not found
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:03.437935 [INFO] acl: initializing acls
TestIntentionsList_values - 2019/12/30 18:54:04.037338 [DEBUG] agent: Node info in sync
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.042992 [INFO] consul: Created ACL 'global-management' policy
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.043067 [WARN] consul: Configuring a non-UUID master token is deprecated
TestIntentionsMatch_basic - 2019/12/30 18:54:04.046274 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.047146 [INFO] acl: initializing acls
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.047259 [WARN] consul: Configuring a non-UUID master token is deprecated
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.322934 [ERR] agent: failed to sync remote state: ACL not found
TestIntentionsList_values - 2019/12/30 18:54:04.368276 [INFO] agent: Requesting shutdown
TestIntentionsList_values - 2019/12/30 18:54:04.368383 [INFO] consul: shutting down server
TestIntentionsList_values - 2019/12/30 18:54:04.368436 [WARN] serf: Shutdown without a Leave
TestIntentionsList_values - 2019/12/30 18:54:04.481640 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b074464a-0d9d-d768-556b-8b33a449c886 Address:127.0.0.1:17944}]
2019/12/30 18:54:04 [INFO]  raft: Node at 127.0.0.1:17944 [Follower] entering Follower state (Leader: "")
TestEnableWebUI - 2019/12/30 18:54:04.487754 [INFO] serf: EventMemberJoin: Node b074464a-0d9d-d768-556b-8b33a449c886.dc1 127.0.0.1
TestEnableWebUI - 2019/12/30 18:54:04.497433 [INFO] serf: EventMemberJoin: Node b074464a-0d9d-d768-556b-8b33a449c886 127.0.0.1
TestEnableWebUI - 2019/12/30 18:54:04.498372 [INFO] consul: Adding LAN server Node b074464a-0d9d-d768-556b-8b33a449c886 (Addr: tcp/127.0.0.1:17944) (DC: dc1)
TestEnableWebUI - 2019/12/30 18:54:04.498949 [INFO] consul: Handled member-join event for server "Node b074464a-0d9d-d768-556b-8b33a449c886.dc1" in area "wan"
TestEnableWebUI - 2019/12/30 18:54:04.501919 [INFO] agent: Started DNS server 127.0.0.1:17939 (tcp)
TestEnableWebUI - 2019/12/30 18:54:04.502043 [INFO] agent: Started DNS server 127.0.0.1:17939 (udp)
TestEnableWebUI - 2019/12/30 18:54:04.506676 [INFO] agent: Started HTTP server on 127.0.0.1:17940 (tcp)
TestEnableWebUI - 2019/12/30 18:54:04.506780 [INFO] agent: started state syncer
2019/12/30 18:54:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:04 [INFO]  raft: Node at 127.0.0.1:17944 [Candidate] entering Candidate state in term 2
TestIntentionsList_values - 2019/12/30 18:54:04.590043 [INFO] manager: shutting down
TestIntentionsList_values - 2019/12/30 18:54:04.595049 [INFO] agent: consul server down
TestIntentionsList_values - 2019/12/30 18:54:04.595120 [INFO] agent: shutdown complete
TestIntentionsList_values - 2019/12/30 18:54:04.595184 [INFO] agent: Stopping DNS server 127.0.0.1:17921 (tcp)
TestIntentionsList_values - 2019/12/30 18:54:04.595350 [INFO] agent: Stopping DNS server 127.0.0.1:17921 (udp)
TestIntentionsList_values - 2019/12/30 18:54:04.595519 [INFO] agent: Stopping HTTP server 127.0.0.1:17922 (tcp)
TestIntentionsList_values - 2019/12/30 18:54:04.595723 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsList_values - 2019/12/30 18:54:04.595793 [INFO] agent: Endpoints down
--- PASS: TestIntentionsList_values (4.83s)
=== CONT  TestACLResolution
TestIntentionsList_values - 2019/12/30 18:54:04.603044 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestACLResolution - 2019/12/30 18:54:04.663697 [WARN] agent: Node name "Node 4e5d7e4b-62a5-f932-2a08-569fdf060c9b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestACLResolution - 2019/12/30 18:54:04.664262 [DEBUG] tlsutil: Update with version 1
TestACLResolution - 2019/12/30 18:54:04.667008 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.707981 [INFO] consul: Bootstrapped ACL master token from configuration
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.707982 [INFO] consul: Bootstrapped ACL master token from configuration
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.929541 [INFO] consul: Created ACL anonymous token from configuration
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.930461 [INFO] serf: EventMemberUpdate: Node 65a5452b-f16b-4be2-4a42-065b096d8b12
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:04.931127 [INFO] serf: EventMemberUpdate: Node 65a5452b-f16b-4be2-4a42-065b096d8b12.dc1
TestIntentionsMatch_basic - 2019/12/30 18:54:05.209206 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestIntentionsMatch_basic - 2019/12/30 18:54:05.209803 [DEBUG] consul: Skipping self join check for "Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f" since the cluster is too small
TestIntentionsMatch_basic - 2019/12/30 18:54:05.209992 [INFO] consul: member 'Node e9bbbb3f-ca96-38b9-c199-84ef0763f27f' joined, marking health alive
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:05.227009 [ERR] leaf watch error: invalid type for leaf response: <nil>
2019/12/30 18:54:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:05 [INFO]  raft: Node at 127.0.0.1:17944 [Leader] entering Leader state
TestEnableWebUI - 2019/12/30 18:54:05.326894 [INFO] consul: cluster leadership acquired
TestEnableWebUI - 2019/12/30 18:54:05.327484 [INFO] consul: New leader elected: Node b074464a-0d9d-d768-556b-8b33a449c886
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:05.445517 [INFO] consul: Created ACL anonymous token from configuration
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:05.445636 [DEBUG] acl: transitioning out of legacy ACL mode
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:05.446506 [INFO] serf: EventMemberUpdate: Node 65a5452b-f16b-4be2-4a42-065b096d8b12
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:05.447204 [INFO] serf: EventMemberUpdate: Node 65a5452b-f16b-4be2-4a42-065b096d8b12.dc1
TestIntentionsMatch_basic - 2019/12/30 18:54:05.653998 [INFO] agent: Requesting shutdown
TestIntentionsMatch_basic - 2019/12/30 18:54:05.654125 [INFO] consul: shutting down server
TestIntentionsMatch_basic - 2019/12/30 18:54:05.654181 [WARN] serf: Shutdown without a Leave
TestEnableWebUI - 2019/12/30 18:54:05.659717 [INFO] agent: Requesting shutdown
TestEnableWebUI - 2019/12/30 18:54:05.659828 [INFO] consul: shutting down server
TestEnableWebUI - 2019/12/30 18:54:05.659878 [WARN] serf: Shutdown without a Leave
TestIntentionsMatch_basic - 2019/12/30 18:54:05.789938 [WARN] serf: Shutdown without a Leave
TestEnableWebUI - 2019/12/30 18:54:05.797096 [WARN] serf: Shutdown without a Leave
TestEnableWebUI - 2019/12/30 18:54:05.801862 [INFO] agent: Synced node info
2019/12/30 18:54:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4e5d7e4b-62a5-f932-2a08-569fdf060c9b Address:127.0.0.1:17950}]
2019/12/30 18:54:06 [INFO]  raft: Node at 127.0.0.1:17950 [Follower] entering Follower state (Leader: "")
TestACLResolution - 2019/12/30 18:54:06.369574 [INFO] serf: EventMemberJoin: Node 4e5d7e4b-62a5-f932-2a08-569fdf060c9b.dc1 127.0.0.1
TestACLResolution - 2019/12/30 18:54:06.390678 [INFO] serf: EventMemberJoin: Node 4e5d7e4b-62a5-f932-2a08-569fdf060c9b 127.0.0.1
TestACLResolution - 2019/12/30 18:54:06.391530 [INFO] consul: Adding LAN server Node 4e5d7e4b-62a5-f932-2a08-569fdf060c9b (Addr: tcp/127.0.0.1:17950) (DC: dc1)
TestACLResolution - 2019/12/30 18:54:06.392179 [INFO] consul: Handled member-join event for server "Node 4e5d7e4b-62a5-f932-2a08-569fdf060c9b.dc1" in area "wan"
TestACLResolution - 2019/12/30 18:54:06.392970 [INFO] agent: Started DNS server 127.0.0.1:17945 (tcp)
TestACLResolution - 2019/12/30 18:54:06.393055 [INFO] agent: Started DNS server 127.0.0.1:17945 (udp)
TestACLResolution - 2019/12/30 18:54:06.395401 [INFO] agent: Started HTTP server on 127.0.0.1:17946 (tcp)
TestACLResolution - 2019/12/30 18:54:06.395506 [INFO] agent: started state syncer
2019/12/30 18:54:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:06 [INFO]  raft: Node at 127.0.0.1:17950 [Candidate] entering Candidate state in term 2
TestIntentionsMatch_basic - 2019/12/30 18:54:06.790792 [INFO] manager: shutting down
TestIntentionsMatch_basic - 2019/12/30 18:54:06.792705 [INFO] agent: consul server down
TestIntentionsMatch_basic - 2019/12/30 18:54:06.792825 [INFO] agent: shutdown complete
TestIntentionsMatch_basic - 2019/12/30 18:54:06.793002 [INFO] agent: Stopping DNS server 127.0.0.1:17915 (tcp)
TestIntentionsMatch_basic - 2019/12/30 18:54:06.793354 [INFO] agent: Stopping DNS server 127.0.0.1:17915 (udp)
TestIntentionsMatch_basic - 2019/12/30 18:54:06.793713 [INFO] agent: Stopping HTTP server 127.0.0.1:17916 (tcp)
TestIntentionsMatch_basic - 2019/12/30 18:54:06.794226 [INFO] agent: Waiting for endpoints to shut down
TestIntentionsMatch_basic - 2019/12/30 18:54:06.794366 [INFO] agent: Endpoints down
--- PASS: TestIntentionsMatch_basic (7.66s)
=== CONT  TestParseConsistency_Invalid
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:06.821241 [INFO] agent: Synced service "test-id"
TestEnableWebUI - 2019/12/30 18:54:06.842763 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestParseConsistency_Invalid - 2019/12/30 18:54:06.930739 [WARN] agent: Node name "Node 1e2f0070-ee19-07ea-e8fd-2c29efe6fd0f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestParseConsistency_Invalid - 2019/12/30 18:54:06.931307 [DEBUG] tlsutil: Update with version 1
TestParseConsistency_Invalid - 2019/12/30 18:54:06.933690 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEnableWebUI - 2019/12/30 18:54:07.428048 [INFO] agent: consul server down
TestEnableWebUI - 2019/12/30 18:54:07.428121 [INFO] agent: shutdown complete
TestEnableWebUI - 2019/12/30 18:54:07.428188 [INFO] agent: Stopping DNS server 127.0.0.1:17939 (tcp)
TestEnableWebUI - 2019/12/30 18:54:07.428225 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestEnableWebUI - 2019/12/30 18:54:07.428340 [INFO] agent: Stopping DNS server 127.0.0.1:17939 (udp)
TestEnableWebUI - 2019/12/30 18:54:07.428510 [INFO] agent: Stopping HTTP server 127.0.0.1:17940 (tcp)
TestEnableWebUI - 2019/12/30 18:54:07.428531 [ERR] consul: failed to establish leadership: raft is already shutdown
TestEnableWebUI - 2019/12/30 18:54:07.428734 [INFO] agent: Waiting for endpoints to shut down
TestEnableWebUI - 2019/12/30 18:54:07.428765 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestEnableWebUI - 2019/12/30 18:54:07.428802 [INFO] agent: Endpoints down
--- PASS: TestEnableWebUI (4.72s)
=== CONT  TestParseConsistency
TestEnableWebUI - 2019/12/30 18:54:07.428825 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestEnableWebUI - 2019/12/30 18:54:07.430364 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestEnableWebUI - 2019/12/30 18:54:07.430422 [ERR] consul: failed to transfer leadership in 3 attempts
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.536576 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestParseConsistency - 2019/12/30 18:54:07.686567 [WARN] agent: Node name "Node 4bcf70bf-d3a0-2b9e-8f69-ee9969e250a2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestParseConsistency - 2019/12/30 18:54:07.687012 [DEBUG] tlsutil: Update with version 1
TestParseConsistency - 2019/12/30 18:54:07.689261 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.792731 [INFO] agent: Synced service "test-id-proxy"
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.792844 [DEBUG] agent: Check "service:test-id" in sync
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.792901 [DEBUG] agent: Check "service:test-id-proxy" in sync
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.792931 [DEBUG] agent: Node info in sync
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/acl/info/root)
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/agent/self)
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/agent/metrics)
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/agent/services)
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/agent/checks)
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/agent/members)
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/agent/connect/ca/roots)
=== RUN   TestParseToken_ProxyTokenResolve/GET(/v1/agent/connect/ca/leaf/test)
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.811109 [INFO] agent: Requesting shutdown
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.811256 [INFO] consul: shutting down server
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.811308 [WARN] serf: Shutdown without a Leave
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:07.974901 [WARN] serf: Shutdown without a Leave
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.073336 [INFO] manager: shutting down
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.074618 [INFO] agent: consul server down
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.074691 [INFO] agent: shutdown complete
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.074751 [INFO] agent: Stopping DNS server 127.0.0.1:17933 (tcp)
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.074919 [INFO] agent: Stopping DNS server 127.0.0.1:17933 (udp)
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.075114 [INFO] agent: Stopping HTTP server 127.0.0.1:17934 (tcp)
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.075365 [INFO] agent: Waiting for endpoints to shut down
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.075446 [INFO] agent: Endpoints down
--- FAIL: TestParseToken_ProxyTokenResolve (7.33s)
    --- PASS: TestParseToken_ProxyTokenResolve/GET(/v1/acl/info/root) (0.00s)
    --- PASS: TestParseToken_ProxyTokenResolve/GET(/v1/agent/self) (0.00s)
    --- PASS: TestParseToken_ProxyTokenResolve/GET(/v1/agent/metrics) (0.00s)
    --- PASS: TestParseToken_ProxyTokenResolve/GET(/v1/agent/services) (0.00s)
    --- PASS: TestParseToken_ProxyTokenResolve/GET(/v1/agent/checks) (0.00s)
    --- PASS: TestParseToken_ProxyTokenResolve/GET(/v1/agent/members) (0.00s)
    --- PASS: TestParseToken_ProxyTokenResolve/GET(/v1/agent/connect/ca/roots) (0.00s)
    --- FAIL: TestParseToken_ProxyTokenResolve/GET(/v1/agent/connect/ca/leaf/test) (0.00s)
        http_test.go:1221: 
            	Error Trace:	http_test.go:1221
            	Error:      	Received unexpected error:
            	            	cluster has no CA bootstrapped yet
            	Test:       	TestParseToken_ProxyTokenResolve/GET(/v1/agent/connect/ca/leaf/test)
=== CONT  TestParseWait_InvalidIndex
--- PASS: TestParseWait_InvalidIndex (0.00s)
=== CONT  TestParseWait_InvalidTime
--- PASS: TestParseWait_InvalidTime (0.00s)
=== CONT  TestPProfHandlers_ACLs
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.078714 [ERR] connect: Apply failed raft is already shutdown
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.078786 [ERR] consul: failed to establish leadership: raft is already shutdown
TestParseToken_ProxyTokenResolve - 2019/12/30 18:54:08.079043 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
2019/12/30 18:54:08 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:08 [INFO]  raft: Node at 127.0.0.1:17950 [Leader] entering Leader state
TestACLResolution - 2019/12/30 18:54:08.081190 [INFO] consul: cluster leadership acquired
TestACLResolution - 2019/12/30 18:54:08.081617 [INFO] consul: New leader elected: Node 4e5d7e4b-62a5-f932-2a08-569fdf060c9b
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestPProfHandlers_ACLs - 2019/12/30 18:54:08.242688 [WARN] agent: Node name "Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPProfHandlers_ACLs - 2019/12/30 18:54:08.243241 [DEBUG] tlsutil: Update with version 1
TestPProfHandlers_ACLs - 2019/12/30 18:54:08.245529 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestACLResolution - 2019/12/30 18:54:08.282175 [INFO] agent: Requesting shutdown
TestACLResolution - 2019/12/30 18:54:08.282285 [INFO] consul: shutting down server
TestACLResolution - 2019/12/30 18:54:08.282351 [WARN] serf: Shutdown without a Leave
TestACLResolution - 2019/12/30 18:54:08.282439 [ERR] agent: failed to sync remote state: No cluster leader
TestACLResolution - 2019/12/30 18:54:08.385867 [WARN] serf: Shutdown without a Leave
TestACLResolution - 2019/12/30 18:54:08.484194 [INFO] manager: shutting down
2019/12/30 18:54:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1e2f0070-ee19-07ea-e8fd-2c29efe6fd0f Address:127.0.0.1:17956}]
2019/12/30 18:54:08 [INFO]  raft: Node at 127.0.0.1:17956 [Follower] entering Follower state (Leader: "")
TestACLResolution - 2019/12/30 18:54:08.605883 [INFO] agent: consul server down
TestACLResolution - 2019/12/30 18:54:08.605941 [INFO] agent: shutdown complete
TestACLResolution - 2019/12/30 18:54:08.605993 [INFO] agent: Stopping DNS server 127.0.0.1:17945 (tcp)
TestACLResolution - 2019/12/30 18:54:08.606144 [INFO] agent: Stopping DNS server 127.0.0.1:17945 (udp)
TestACLResolution - 2019/12/30 18:54:08.606317 [INFO] agent: Stopping HTTP server 127.0.0.1:17946 (tcp)
TestACLResolution - 2019/12/30 18:54:08.606992 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestACLResolution - 2019/12/30 18:54:08.609337 [INFO] agent: Waiting for endpoints to shut down
TestACLResolution - 2019/12/30 18:54:08.609480 [INFO] agent: Endpoints down
--- PASS: TestACLResolution (4.01s)
=== CONT  TestPProfHandlers_EnableDebug
TestParseConsistency_Invalid - 2019/12/30 18:54:08.617017 [INFO] serf: EventMemberJoin: Node 1e2f0070-ee19-07ea-e8fd-2c29efe6fd0f.dc1 127.0.0.1
TestParseConsistency_Invalid - 2019/12/30 18:54:08.620271 [INFO] serf: EventMemberJoin: Node 1e2f0070-ee19-07ea-e8fd-2c29efe6fd0f 127.0.0.1
TestParseConsistency_Invalid - 2019/12/30 18:54:08.621038 [INFO] consul: Handled member-join event for server "Node 1e2f0070-ee19-07ea-e8fd-2c29efe6fd0f.dc1" in area "wan"
TestParseConsistency_Invalid - 2019/12/30 18:54:08.621327 [INFO] consul: Adding LAN server Node 1e2f0070-ee19-07ea-e8fd-2c29efe6fd0f (Addr: tcp/127.0.0.1:17956) (DC: dc1)
TestParseConsistency_Invalid - 2019/12/30 18:54:08.621495 [INFO] agent: Started DNS server 127.0.0.1:17951 (udp)
TestParseConsistency_Invalid - 2019/12/30 18:54:08.621809 [INFO] agent: Started DNS server 127.0.0.1:17951 (tcp)
TestParseConsistency_Invalid - 2019/12/30 18:54:08.624107 [INFO] agent: Started HTTP server on 127.0.0.1:17952 (tcp)
TestParseConsistency_Invalid - 2019/12/30 18:54:08.624227 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:08.670946 [WARN] agent: Node name "Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:08.671367 [DEBUG] tlsutil: Update with version 1
2019/12/30 18:54:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:08 [INFO]  raft: Node at 127.0.0.1:17956 [Candidate] entering Candidate state in term 2
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:08.673785 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4bcf70bf-d3a0-2b9e-8f69-ee9969e250a2 Address:127.0.0.1:17962}]
2019/12/30 18:54:08 [INFO]  raft: Node at 127.0.0.1:17962 [Follower] entering Follower state (Leader: "")
TestParseConsistency - 2019/12/30 18:54:08.941702 [INFO] serf: EventMemberJoin: Node 4bcf70bf-d3a0-2b9e-8f69-ee9969e250a2.dc1 127.0.0.1
TestParseConsistency - 2019/12/30 18:54:08.945506 [INFO] serf: EventMemberJoin: Node 4bcf70bf-d3a0-2b9e-8f69-ee9969e250a2 127.0.0.1
TestParseConsistency - 2019/12/30 18:54:08.946433 [INFO] consul: Adding LAN server Node 4bcf70bf-d3a0-2b9e-8f69-ee9969e250a2 (Addr: tcp/127.0.0.1:17962) (DC: dc1)
TestParseConsistency - 2019/12/30 18:54:08.946756 [INFO] consul: Handled member-join event for server "Node 4bcf70bf-d3a0-2b9e-8f69-ee9969e250a2.dc1" in area "wan"
TestParseConsistency - 2019/12/30 18:54:08.946843 [INFO] agent: Started DNS server 127.0.0.1:17957 (udp)
TestParseConsistency - 2019/12/30 18:54:08.947221 [INFO] agent: Started DNS server 127.0.0.1:17957 (tcp)
TestParseConsistency - 2019/12/30 18:54:08.949515 [INFO] agent: Started HTTP server on 127.0.0.1:17958 (tcp)
TestParseConsistency - 2019/12/30 18:54:08.949623 [INFO] agent: started state syncer
2019/12/30 18:54:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:08 [INFO]  raft: Node at 127.0.0.1:17962 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:09 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:09 [INFO]  raft: Node at 127.0.0.1:17956 [Leader] entering Leader state
TestParseConsistency_Invalid - 2019/12/30 18:54:09.434945 [INFO] consul: cluster leadership acquired
TestParseConsistency_Invalid - 2019/12/30 18:54:09.436851 [INFO] consul: New leader elected: Node 1e2f0070-ee19-07ea-e8fd-2c29efe6fd0f
TestParseConsistency_Invalid - 2019/12/30 18:54:09.439033 [INFO] agent: Requesting shutdown
TestParseConsistency_Invalid - 2019/12/30 18:54:09.439334 [INFO] consul: shutting down server
TestParseConsistency_Invalid - 2019/12/30 18:54:09.439657 [WARN] serf: Shutdown without a Leave
TestParseConsistency_Invalid - 2019/12/30 18:54:09.440424 [ERR] agent: failed to sync remote state: No cluster leader
TestParseConsistency_Invalid - 2019/12/30 18:54:09.540058 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5d224d96-2a5a-ccbf-3d34-579d2575af9b Address:127.0.0.1:17968}]
2019/12/30 18:54:09 [INFO]  raft: Node at 127.0.0.1:17968 [Follower] entering Follower state (Leader: "")
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.547489 [INFO] serf: EventMemberJoin: Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b.dc1 127.0.0.1
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.555199 [INFO] serf: EventMemberJoin: Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b 127.0.0.1
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.556308 [INFO] consul: Adding LAN server Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b (Addr: tcp/127.0.0.1:17968) (DC: dc1)
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.557123 [INFO] consul: Handled member-join event for server "Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b.dc1" in area "wan"
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.557956 [INFO] agent: Started DNS server 127.0.0.1:17963 (udp)
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.558404 [INFO] agent: Started DNS server 127.0.0.1:17963 (tcp)
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.560893 [INFO] agent: Started HTTP server on 127.0.0.1:17964 (tcp)
TestPProfHandlers_ACLs - 2019/12/30 18:54:09.560993 [INFO] agent: started state syncer
2019/12/30 18:54:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:09 [INFO]  raft: Node at 127.0.0.1:17968 [Candidate] entering Candidate state in term 2
TestParseConsistency_Invalid - 2019/12/30 18:54:09.687988 [INFO] manager: shutting down
2019/12/30 18:54:09 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:09 [INFO]  raft: Node at 127.0.0.1:17962 [Leader] entering Leader state
TestParseConsistency - 2019/12/30 18:54:09.942879 [INFO] consul: cluster leadership acquired
TestParseConsistency - 2019/12/30 18:54:09.943370 [INFO] consul: New leader elected: Node 4bcf70bf-d3a0-2b9e-8f69-ee9969e250a2
TestParseConsistency - 2019/12/30 18:54:10.008423 [INFO] agent: Requesting shutdown
TestParseConsistency - 2019/12/30 18:54:10.008548 [INFO] consul: shutting down server
TestParseConsistency - 2019/12/30 18:54:10.008597 [WARN] serf: Shutdown without a Leave
TestParseConsistency - 2019/12/30 18:54:10.008942 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:54:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8 Address:127.0.0.1:17974}]
TestParseConsistency_Invalid - 2019/12/30 18:54:10.030450 [INFO] agent: consul server down
TestParseConsistency_Invalid - 2019/12/30 18:54:10.030527 [INFO] agent: shutdown complete
TestParseConsistency_Invalid - 2019/12/30 18:54:10.030601 [INFO] agent: Stopping DNS server 127.0.0.1:17951 (tcp)
TestParseConsistency_Invalid - 2019/12/30 18:54:10.030739 [INFO] agent: Stopping DNS server 127.0.0.1:17951 (udp)
TestParseConsistency_Invalid - 2019/12/30 18:54:10.030892 [INFO] agent: Stopping HTTP server 127.0.0.1:17952 (tcp)
TestParseConsistency_Invalid - 2019/12/30 18:54:10.031117 [INFO] agent: Waiting for endpoints to shut down
TestParseConsistency_Invalid - 2019/12/30 18:54:10.031195 [INFO] agent: Endpoints down
--- PASS: TestParseConsistency_Invalid (3.24s)
=== CONT  TestParseWait
--- PASS: TestParseWait (0.00s)
=== CONT  TestParseSource
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.035412 [INFO] serf: EventMemberJoin: Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8.dc1 127.0.0.1
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.038820 [INFO] serf: EventMemberJoin: Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8 127.0.0.1
2019/12/30 18:54:10 [INFO]  raft: Node at 127.0.0.1:17974 [Follower] entering Follower state (Leader: "")
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.040661 [INFO] consul: Adding LAN server Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8 (Addr: tcp/127.0.0.1:17974) (DC: dc1)
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.040874 [INFO] consul: Handled member-join event for server "Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8.dc1" in area "wan"
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.041311 [INFO] agent: Started DNS server 127.0.0.1:17969 (udp)
TestParseConsistency_Invalid - 2019/12/30 18:54:10.041531 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.041712 [INFO] agent: Started DNS server 127.0.0.1:17969 (tcp)
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.044044 [INFO] agent: Started HTTP server on 127.0.0.1:17970 (tcp)
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.044264 [INFO] agent: started state syncer
2019/12/30 18:54:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:10 [INFO]  raft: Node at 127.0.0.1:17974 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestParseSource - 2019/12/30 18:54:10.107295 [WARN] agent: Node name "Node d3268261-99a6-5d20-e65f-16fadd14fae4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestParseSource - 2019/12/30 18:54:10.107749 [DEBUG] tlsutil: Update with version 1
TestParseSource - 2019/12/30 18:54:10.109932 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestParseConsistency - 2019/12/30 18:54:10.131903 [WARN] serf: Shutdown without a Leave
TestParseConsistency - 2019/12/30 18:54:10.228663 [INFO] manager: shutting down
2019/12/30 18:54:10 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:10 [INFO]  raft: Node at 127.0.0.1:17968 [Leader] entering Leader state
TestParseConsistency - 2019/12/30 18:54:10.498778 [INFO] agent: consul server down
TestParseConsistency - 2019/12/30 18:54:10.498847 [INFO] agent: shutdown complete
TestParseConsistency - 2019/12/30 18:54:10.498913 [INFO] agent: Stopping DNS server 127.0.0.1:17957 (tcp)
TestParseConsistency - 2019/12/30 18:54:10.499118 [INFO] agent: Stopping DNS server 127.0.0.1:17957 (udp)
TestParseConsistency - 2019/12/30 18:54:10.499337 [INFO] agent: Stopping HTTP server 127.0.0.1:17958 (tcp)
TestParseConsistency - 2019/12/30 18:54:10.499720 [INFO] agent: Waiting for endpoints to shut down
TestParseConsistency - 2019/12/30 18:54:10.499840 [INFO] agent: Endpoints down
--- PASS: TestParseConsistency (3.07s)
=== CONT  TestPrettyPrintBare
TestParseConsistency - 2019/12/30 18:54:10.504846 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestPProfHandlers_ACLs - 2019/12/30 18:54:10.505011 [INFO] consul: cluster leadership acquired
TestPProfHandlers_ACLs - 2019/12/30 18:54:10.505437 [INFO] consul: New leader elected: Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b
WARNING: bootstrap = true: do not enable unless necessary
TestPrettyPrintBare - 2019/12/30 18:54:10.556191 [WARN] agent: Node name "Node 630e98f0-3ea6-0ff5-b8ee-fad458107547" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPrettyPrintBare - 2019/12/30 18:54:10.556731 [DEBUG] tlsutil: Update with version 1
TestPrettyPrintBare - 2019/12/30 18:54:10.558838 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:10 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:10 [INFO]  raft: Node at 127.0.0.1:17974 [Leader] entering Leader state
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.675576 [INFO] consul: cluster leadership acquired
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:10.676025 [INFO] consul: New leader elected: Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8
TestPProfHandlers_ACLs - 2019/12/30 18:54:10.735017 [ERR] agent: failed to sync remote state: ACL not found
TestPProfHandlers_ACLs - 2019/12/30 18:54:10.876242 [INFO] acl: initializing acls
jones - 2019/12/30 18:54:11.095334 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:54:11.095415 [DEBUG] agent: Node info in sync
TestPProfHandlers_ACLs - 2019/12/30 18:54:11.100449 [INFO] consul: Created ACL 'global-management' policy
TestPProfHandlers_ACLs - 2019/12/30 18:54:11.100533 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPProfHandlers_ACLs - 2019/12/30 18:54:11.113358 [INFO] acl: initializing acls
TestPProfHandlers_ACLs - 2019/12/30 18:54:11.113592 [WARN] consul: Configuring a non-UUID master token is deprecated
2019/12/30 18:54:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d3268261-99a6-5d20-e65f-16fadd14fae4 Address:127.0.0.1:17980}]
2019/12/30 18:54:11 [INFO]  raft: Node at 127.0.0.1:17980 [Follower] entering Follower state (Leader: "")
TestParseSource - 2019/12/30 18:54:11.199499 [INFO] serf: EventMemberJoin: Node d3268261-99a6-5d20-e65f-16fadd14fae4.dc1 127.0.0.1
TestParseSource - 2019/12/30 18:54:11.206587 [INFO] serf: EventMemberJoin: Node d3268261-99a6-5d20-e65f-16fadd14fae4 127.0.0.1
TestParseSource - 2019/12/30 18:54:11.207746 [INFO] consul: Adding LAN server Node d3268261-99a6-5d20-e65f-16fadd14fae4 (Addr: tcp/127.0.0.1:17980) (DC: dc1)
TestParseSource - 2019/12/30 18:54:11.207990 [INFO] consul: Handled member-join event for server "Node d3268261-99a6-5d20-e65f-16fadd14fae4.dc1" in area "wan"
TestParseSource - 2019/12/30 18:54:11.209189 [INFO] agent: Started DNS server 127.0.0.1:17975 (tcp)
TestParseSource - 2019/12/30 18:54:11.209257 [INFO] agent: Started DNS server 127.0.0.1:17975 (udp)
TestParseSource - 2019/12/30 18:54:11.211588 [INFO] agent: Started HTTP server on 127.0.0.1:17976 (tcp)
TestParseSource - 2019/12/30 18:54:11.211681 [INFO] agent: started state syncer
2019/12/30 18:54:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:11 [INFO]  raft: Node at 127.0.0.1:17980 [Candidate] entering Candidate state in term 2
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:11.317407 [INFO] agent: Synced node info
2019/12/30 18:54:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:630e98f0-3ea6-0ff5-b8ee-fad458107547 Address:127.0.0.1:17986}]
TestPProfHandlers_ACLs - 2019/12/30 18:54:11.608332 [INFO] consul: Bootstrapped ACL master token from configuration
TestPrettyPrintBare - 2019/12/30 18:54:11.611205 [INFO] serf: EventMemberJoin: Node 630e98f0-3ea6-0ff5-b8ee-fad458107547.dc1 127.0.0.1
2019/12/30 18:54:11 [INFO]  raft: Node at 127.0.0.1:17986 [Follower] entering Follower state (Leader: "")
TestPProfHandlers_ACLs - 2019/12/30 18:54:11.613063 [INFO] consul: Bootstrapped ACL master token from configuration
TestPrettyPrintBare - 2019/12/30 18:54:11.615169 [INFO] serf: EventMemberJoin: Node 630e98f0-3ea6-0ff5-b8ee-fad458107547 127.0.0.1
TestPrettyPrintBare - 2019/12/30 18:54:11.616861 [INFO] consul: Handled member-join event for server "Node 630e98f0-3ea6-0ff5-b8ee-fad458107547.dc1" in area "wan"
TestPrettyPrintBare - 2019/12/30 18:54:11.617249 [INFO] consul: Adding LAN server Node 630e98f0-3ea6-0ff5-b8ee-fad458107547 (Addr: tcp/127.0.0.1:17986) (DC: dc1)
TestPrettyPrintBare - 2019/12/30 18:54:11.617542 [INFO] agent: Started DNS server 127.0.0.1:17981 (udp)
TestPrettyPrintBare - 2019/12/30 18:54:11.617914 [INFO] agent: Started DNS server 127.0.0.1:17981 (tcp)
TestPrettyPrintBare - 2019/12/30 18:54:11.621116 [INFO] agent: Started HTTP server on 127.0.0.1:17982 (tcp)
TestPrettyPrintBare - 2019/12/30 18:54:11.625871 [INFO] agent: started state syncer
2019/12/30 18:54:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:11 [INFO]  raft: Node at 127.0.0.1:17986 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:12 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:12 [INFO]  raft: Node at 127.0.0.1:17980 [Leader] entering Leader state
TestParseSource - 2019/12/30 18:54:12.034546 [INFO] consul: cluster leadership acquired
TestParseSource - 2019/12/30 18:54:12.034977 [INFO] consul: New leader elected: Node d3268261-99a6-5d20-e65f-16fadd14fae4
TestParseSource - 2019/12/30 18:54:12.110541 [INFO] agent: Requesting shutdown
TestParseSource - 2019/12/30 18:54:12.110632 [INFO] consul: shutting down server
TestParseSource - 2019/12/30 18:54:12.110679 [WARN] serf: Shutdown without a Leave
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.142153 [INFO] consul: Created ACL anonymous token from configuration
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.143055 [INFO] serf: EventMemberUpdate: Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.143677 [INFO] serf: EventMemberUpdate: Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b.dc1
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.145483 [INFO] consul: Created ACL anonymous token from configuration
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.145563 [DEBUG] acl: transitioning out of legacy ACL mode
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.146381 [INFO] serf: EventMemberUpdate: Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.147067 [INFO] serf: EventMemberUpdate: Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b.dc1
TestParseSource - 2019/12/30 18:54:12.266063 [WARN] serf: Shutdown without a Leave
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:12.345354 [DEBUG] agent: Node info in sync
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:12.345479 [DEBUG] agent: Node info in sync
2019/12/30 18:54:12 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:12 [INFO]  raft: Node at 127.0.0.1:17986 [Leader] entering Leader state
TestParseSource - 2019/12/30 18:54:12.369595 [INFO] manager: shutting down
TestPrettyPrintBare - 2019/12/30 18:54:12.371089 [INFO] consul: cluster leadership acquired
TestPrettyPrintBare - 2019/12/30 18:54:12.371467 [INFO] consul: New leader elected: Node 630e98f0-3ea6-0ff5-b8ee-fad458107547
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:12.467100 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:12.467633 [DEBUG] consul: Skipping self join check for "Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8" since the cluster is too small
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:12.467831 [INFO] consul: member 'Node 22139a2e-96b0-cfa0-5a8c-75d21e8c7ef8' joined, marking health alive
TestParseSource - 2019/12/30 18:54:12.468087 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestParseSource - 2019/12/30 18:54:12.468206 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestParseSource - 2019/12/30 18:54:12.468260 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestParseSource - 2019/12/30 18:54:12.468212 [INFO] agent: consul server down
TestParseSource - 2019/12/30 18:54:12.468428 [INFO] agent: shutdown complete
TestParseSource - 2019/12/30 18:54:12.468482 [INFO] agent: Stopping DNS server 127.0.0.1:17975 (tcp)
TestParseSource - 2019/12/30 18:54:12.468669 [INFO] agent: Stopping DNS server 127.0.0.1:17975 (udp)
TestParseSource - 2019/12/30 18:54:12.468864 [INFO] agent: Stopping HTTP server 127.0.0.1:17976 (tcp)
TestParseSource - 2019/12/30 18:54:12.469081 [INFO] agent: Waiting for endpoints to shut down
TestParseSource - 2019/12/30 18:54:12.469152 [INFO] agent: Endpoints down
--- PASS: TestParseSource (2.44s)
=== CONT  TestPrettyPrint
WARNING: bootstrap = true: do not enable unless necessary
TestPrettyPrint - 2019/12/30 18:54:12.573848 [WARN] agent: Node name "Node eaaa14f3-dfe0-923b-ab7c-342a729a293d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPrettyPrint - 2019/12/30 18:54:12.574362 [DEBUG] tlsutil: Update with version 1
TestPrettyPrint - 2019/12/30 18:54:12.576530 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:54:12.584668 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:54:12.584763 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/30 18:54:12.584802 [DEBUG] agent: Node info in sync
TestPProfHandlers_ACLs - 2019/12/30 18:54:12.616491 [ERR] agent: failed to sync remote state: ACL not found
TestPrettyPrintBare - 2019/12/30 18:54:12.777129 [DEBUG] http: Request GET /v1/kv/key?pretty (334.342µs) from=
TestPrettyPrintBare - 2019/12/30 18:54:12.777318 [INFO] agent: Requesting shutdown
TestPrettyPrintBare - 2019/12/30 18:54:12.777398 [INFO] consul: shutting down server
TestPrettyPrintBare - 2019/12/30 18:54:12.777444 [WARN] serf: Shutdown without a Leave
TestPrettyPrintBare - 2019/12/30 18:54:12.785423 [INFO] agent: Synced node info
TestPrettyPrintBare - 2019/12/30 18:54:12.875535 [WARN] serf: Shutdown without a Leave
TestPrettyPrintBare - 2019/12/30 18:54:12.949787 [INFO] manager: shutting down
TestPrettyPrintBare - 2019/12/30 18:54:13.040416 [INFO] agent: consul server down
TestPrettyPrintBare - 2019/12/30 18:54:13.040503 [INFO] agent: shutdown complete
TestPrettyPrintBare - 2019/12/30 18:54:13.040566 [INFO] agent: Stopping DNS server 127.0.0.1:17981 (tcp)
TestPrettyPrintBare - 2019/12/30 18:54:13.040717 [INFO] agent: Stopping DNS server 127.0.0.1:17981 (udp)
TestPrettyPrintBare - 2019/12/30 18:54:13.040884 [INFO] agent: Stopping HTTP server 127.0.0.1:17982 (tcp)
TestPrettyPrintBare - 2019/12/30 18:54:13.041096 [INFO] agent: Waiting for endpoints to shut down
TestPrettyPrintBare - 2019/12/30 18:54:13.041169 [INFO] agent: Endpoints down
--- PASS: TestPrettyPrintBare (2.54s)
=== CONT  TestHTTP_wrap_obfuscateLog
TestPrettyPrintBare - 2019/12/30 18:54:13.044649 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestPrettyPrintBare - 2019/12/30 18:54:13.044966 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.340895 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.341509 [DEBUG] consul: Skipping self join check for "Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b" since the cluster is too small
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.341630 [INFO] consul: member 'Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b' joined, marking health alive
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:13.503185 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.597080 [DEBUG] consul: Skipping self join check for "Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b" since the cluster is too small
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.614762 [DEBUG] consul: Skipping self join check for "Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b" since the cluster is too small
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.610305 [DEBUG] consul: dropping node "Node 5d224d96-2a5a-ccbf-3d34-579d2575af9b" from result due to ACLs
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.615948 [INFO] agent: Requesting shutdown
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.616030 [INFO] consul: shutting down server
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.616081 [WARN] serf: Shutdown without a Leave
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.740059 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eaaa14f3-dfe0-923b-ab7c-342a729a293d Address:127.0.0.1:17992}]
2019/12/30 18:54:13 [INFO]  raft: Node at 127.0.0.1:17992 [Follower] entering Follower state (Leader: "")
TestPrettyPrint - 2019/12/30 18:54:13.819084 [INFO] serf: EventMemberJoin: Node eaaa14f3-dfe0-923b-ab7c-342a729a293d.dc1 127.0.0.1
TestPrettyPrint - 2019/12/30 18:54:13.829372 [INFO] serf: EventMemberJoin: Node eaaa14f3-dfe0-923b-ab7c-342a729a293d 127.0.0.1
TestPrettyPrint - 2019/12/30 18:54:13.837736 [INFO] consul: Adding LAN server Node eaaa14f3-dfe0-923b-ab7c-342a729a293d (Addr: tcp/127.0.0.1:17992) (DC: dc1)
TestPrettyPrint - 2019/12/30 18:54:13.838079 [INFO] consul: Handled member-join event for server "Node eaaa14f3-dfe0-923b-ab7c-342a729a293d.dc1" in area "wan"
TestPrettyPrint - 2019/12/30 18:54:13.839458 [INFO] agent: Started DNS server 127.0.0.1:17987 (udp)
TestPrettyPrint - 2019/12/30 18:54:13.839619 [INFO] agent: Started DNS server 127.0.0.1:17987 (tcp)
TestPrettyPrint - 2019/12/30 18:54:13.841892 [INFO] agent: Started HTTP server on 127.0.0.1:17988 (tcp)
TestPrettyPrint - 2019/12/30 18:54:13.841986 [INFO] agent: started state syncer
2019/12/30 18:54:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:13 [INFO]  raft: Node at 127.0.0.1:17992 [Candidate] entering Candidate state in term 2
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.881956 [INFO] manager: shutting down
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.882353 [INFO] agent: consul server down
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.882407 [INFO] agent: shutdown complete
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.882462 [INFO] agent: Stopping DNS server 127.0.0.1:17963 (tcp)
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.882628 [INFO] agent: Stopping DNS server 127.0.0.1:17963 (udp)
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.882795 [INFO] agent: Stopping HTTP server 127.0.0.1:17964 (tcp)
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.882998 [INFO] agent: Waiting for endpoints to shut down
TestPProfHandlers_ACLs - 2019/12/30 18:54:13.883044 [INFO] agent: Endpoints down
--- PASS: TestPProfHandlers_ACLs (5.81s)
=== CONT  TestContentTypeIsJSON
WARNING: bootstrap = true: do not enable unless necessary
TestContentTypeIsJSON - 2019/12/30 18:54:14.020116 [WARN] agent: Node name "Node f12f61e5-5947-915c-5cdf-3c7afa4b2a04" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestContentTypeIsJSON - 2019/12/30 18:54:14.020733 [DEBUG] tlsutil: Update with version 1
TestContentTypeIsJSON - 2019/12/30 18:54:14.023265 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:14 [INFO]  raft: Node at 127.0.0.1:17992 [Leader] entering Leader state
TestPrettyPrint - 2019/12/30 18:54:14.416693 [INFO] consul: cluster leadership acquired
TestPrettyPrint - 2019/12/30 18:54:14.417151 [INFO] consul: New leader elected: Node eaaa14f3-dfe0-923b-ab7c-342a729a293d
TestPrettyPrint - 2019/12/30 18:54:14.554776 [DEBUG] http: Request GET /v1/kv/key?pretty=1 (145.004µs) from=
TestPrettyPrint - 2019/12/30 18:54:14.554977 [INFO] agent: Requesting shutdown
TestPrettyPrint - 2019/12/30 18:54:14.555046 [INFO] consul: shutting down server
TestPrettyPrint - 2019/12/30 18:54:14.555093 [WARN] serf: Shutdown without a Leave
TestPrettyPrint - 2019/12/30 18:54:14.706820 [WARN] serf: Shutdown without a Leave
TestPrettyPrint - 2019/12/30 18:54:14.798665 [INFO] manager: shutting down
TestPrettyPrint - 2019/12/30 18:54:14.915398 [INFO] agent: consul server down
TestPrettyPrint - 2019/12/30 18:54:14.915475 [INFO] agent: shutdown complete
TestPrettyPrint - 2019/12/30 18:54:14.915549 [INFO] agent: Stopping DNS server 127.0.0.1:17987 (tcp)
TestPrettyPrint - 2019/12/30 18:54:14.915712 [INFO] agent: Stopping DNS server 127.0.0.1:17987 (udp)
TestPrettyPrint - 2019/12/30 18:54:14.915896 [INFO] agent: Stopping HTTP server 127.0.0.1:17988 (tcp)
TestPrettyPrint - 2019/12/30 18:54:14.916234 [INFO] agent: Waiting for endpoints to shut down
TestPrettyPrint - 2019/12/30 18:54:14.916326 [INFO] agent: Endpoints down
TestPrettyPrint - 2019/12/30 18:54:14.916380 [ERR] consul: failed to wait for barrier: leadership lost while committing log
--- PASS: TestPrettyPrint (2.45s)
=== CONT  TestHTTPAPIResponseHeaders
TestPrettyPrint - 2019/12/30 18:54:14.916496 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestPrettyPrint - 2019/12/30 18:54:14.916547 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:14.985840 [WARN] agent: Node name "Node 5fdf5af0-c7f2-ced6-d605-fe9ac3292376" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:14.986295 [DEBUG] tlsutil: Update with version 1
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:14.989194 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f12f61e5-5947-915c-5cdf-3c7afa4b2a04 Address:127.0.0.1:18004}]
2019/12/30 18:54:15 [INFO]  raft: Node at 127.0.0.1:18004 [Follower] entering Follower state (Leader: "")
TestContentTypeIsJSON - 2019/12/30 18:54:15.120156 [INFO] serf: EventMemberJoin: Node f12f61e5-5947-915c-5cdf-3c7afa4b2a04.dc1 127.0.0.1
TestContentTypeIsJSON - 2019/12/30 18:54:15.124242 [INFO] serf: EventMemberJoin: Node f12f61e5-5947-915c-5cdf-3c7afa4b2a04 127.0.0.1
TestContentTypeIsJSON - 2019/12/30 18:54:15.125808 [INFO] consul: Adding LAN server Node f12f61e5-5947-915c-5cdf-3c7afa4b2a04 (Addr: tcp/127.0.0.1:18004) (DC: dc1)
TestContentTypeIsJSON - 2019/12/30 18:54:15.126079 [INFO] consul: Handled member-join event for server "Node f12f61e5-5947-915c-5cdf-3c7afa4b2a04.dc1" in area "wan"
TestContentTypeIsJSON - 2019/12/30 18:54:15.126859 [INFO] agent: Started DNS server 127.0.0.1:17999 (tcp)
TestContentTypeIsJSON - 2019/12/30 18:54:15.126935 [INFO] agent: Started DNS server 127.0.0.1:17999 (udp)
TestContentTypeIsJSON - 2019/12/30 18:54:15.129555 [INFO] agent: Started HTTP server on 127.0.0.1:18000 (tcp)
TestContentTypeIsJSON - 2019/12/30 18:54:15.129682 [INFO] agent: started state syncer
2019/12/30 18:54:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:15 [INFO]  raft: Node at 127.0.0.1:18004 [Candidate] entering Candidate state in term 2
--- PASS: TestHTTP_wrap_obfuscateLog (2.71s)
=== CONT  TestHTTPAPI_TranslateAddrHeader
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:15.810327 [WARN] agent: Node name "Node 835f7527-48bb-338d-8996-3e36aced34c4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:15.810763 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:15.812961 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:15 [INFO]  raft: Node at 127.0.0.1:18004 [Leader] entering Leader state
TestContentTypeIsJSON - 2019/12/30 18:54:15.944537 [INFO] consul: cluster leadership acquired
TestContentTypeIsJSON - 2019/12/30 18:54:15.944992 [INFO] consul: New leader elected: Node f12f61e5-5947-915c-5cdf-3c7afa4b2a04
TestContentTypeIsJSON - 2019/12/30 18:54:16.022839 [DEBUG] http: Request GET /v1/kv/key (93.003µs) from=
TestContentTypeIsJSON - 2019/12/30 18:54:16.022925 [INFO] agent: Requesting shutdown
TestContentTypeIsJSON - 2019/12/30 18:54:16.022986 [INFO] consul: shutting down server
TestContentTypeIsJSON - 2019/12/30 18:54:16.023034 [WARN] serf: Shutdown without a Leave
TestContentTypeIsJSON - 2019/12/30 18:54:16.090089 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5fdf5af0-c7f2-ced6-d605-fe9ac3292376 Address:127.0.0.1:18010}]
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:18010 [Follower] entering Follower state (Leader: "")
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.095641 [INFO] serf: EventMemberJoin: Node 5fdf5af0-c7f2-ced6-d605-fe9ac3292376.dc1 127.0.0.1
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.101048 [INFO] serf: EventMemberJoin: Node 5fdf5af0-c7f2-ced6-d605-fe9ac3292376 127.0.0.1
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.101951 [INFO] consul: Adding LAN server Node 5fdf5af0-c7f2-ced6-d605-fe9ac3292376 (Addr: tcp/127.0.0.1:18010) (DC: dc1)
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.102492 [INFO] consul: Handled member-join event for server "Node 5fdf5af0-c7f2-ced6-d605-fe9ac3292376.dc1" in area "wan"
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.103840 [INFO] agent: Started DNS server 127.0.0.1:18005 (tcp)
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.104226 [INFO] agent: Started DNS server 127.0.0.1:18005 (udp)
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.107914 [INFO] agent: Started HTTP server on 127.0.0.1:18006 (tcp)
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.108154 [INFO] agent: started state syncer
2019/12/30 18:54:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:18010 [Candidate] entering Candidate state in term 2
TestContentTypeIsJSON - 2019/12/30 18:54:16.206800 [INFO] manager: shutting down
TestContentTypeIsJSON - 2019/12/30 18:54:16.382185 [INFO] agent: consul server down
TestContentTypeIsJSON - 2019/12/30 18:54:16.382274 [INFO] agent: shutdown complete
TestContentTypeIsJSON - 2019/12/30 18:54:16.382343 [INFO] agent: Stopping DNS server 127.0.0.1:17999 (tcp)
TestContentTypeIsJSON - 2019/12/30 18:54:16.382587 [INFO] agent: Stopping DNS server 127.0.0.1:17999 (udp)
TestContentTypeIsJSON - 2019/12/30 18:54:16.382761 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestContentTypeIsJSON - 2019/12/30 18:54:16.382780 [INFO] agent: Stopping HTTP server 127.0.0.1:18000 (tcp)
TestContentTypeIsJSON - 2019/12/30 18:54:16.382877 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestContentTypeIsJSON - 2019/12/30 18:54:16.382932 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestContentTypeIsJSON - 2019/12/30 18:54:16.383042 [INFO] agent: Waiting for endpoints to shut down
TestContentTypeIsJSON - 2019/12/30 18:54:16.383119 [INFO] agent: Endpoints down
--- PASS: TestContentTypeIsJSON (2.50s)
=== CONT  TestHTTPAPI_BlockEndpoints
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:16.517614 [WARN] agent: Node name "Node a248976b-f853-3563-cab2-2f75d8100afc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:16.518288 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:16.522310 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:18010 [Leader] entering Leader state
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.742925 [INFO] consul: cluster leadership acquired
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.746340 [INFO] consul: New leader elected: Node 5fdf5af0-c7f2-ced6-d605-fe9ac3292376
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.768557 [DEBUG] http: Request GET /v1/agent/self (5.333µs) from=
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.768648 [INFO] agent: Requesting shutdown
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.768709 [INFO] consul: shutting down server
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.768754 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:835f7527-48bb-338d-8996-3e36aced34c4 Address:127.0.0.1:18016}]
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:18016 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.823060 [INFO] serf: EventMemberJoin: Node 835f7527-48bb-338d-8996-3e36aced34c4.dc1 127.0.0.1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.826575 [INFO] serf: EventMemberJoin: Node 835f7527-48bb-338d-8996-3e36aced34c4 127.0.0.1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.827355 [INFO] consul: Adding LAN server Node 835f7527-48bb-338d-8996-3e36aced34c4 (Addr: tcp/127.0.0.1:18016) (DC: dc1)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.827470 [INFO] consul: Handled member-join event for server "Node 835f7527-48bb-338d-8996-3e36aced34c4.dc1" in area "wan"
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.827951 [INFO] agent: Started DNS server 127.0.0.1:18011 (tcp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.828978 [INFO] agent: Started DNS server 127.0.0.1:18011 (udp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.831567 [INFO] agent: Started HTTP server on 127.0.0.1:18012 (tcp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:16.831679 [INFO] agent: started state syncer
2019/12/30 18:54:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:18016 [Candidate] entering Candidate state in term 2
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:16.923417 [WARN] serf: Shutdown without a Leave
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.073593 [INFO] manager: shutting down
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.165537 [INFO] agent: consul server down
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.165637 [INFO] agent: shutdown complete
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.165713 [INFO] agent: Stopping DNS server 127.0.0.1:18005 (tcp)
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.165900 [INFO] agent: Stopping DNS server 127.0.0.1:18005 (udp)
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.166081 [INFO] agent: Stopping HTTP server 127.0.0.1:18006 (tcp)
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.166192 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.166318 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.166374 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.166325 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPIResponseHeaders - 2019/12/30 18:54:17.166514 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPIResponseHeaders (2.25s)
=== CONT  TestSetMeta
--- PASS: TestSetMeta (0.00s)
=== CONT  TestSetLastContact
--- PASS: TestSetLastContact (0.00s)
=== CONT  TestSetKnownLeader
--- PASS: TestSetKnownLeader (0.00s)
=== CONT  TestSetIndex
--- PASS: TestSetIndex (0.00s)
=== CONT  TestHTTPServer_UnixSocket_FileExists
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:17.310487 [WARN] agent: Node name "Node bd29444e-fca7-e341-4d6f-0a660b87f78a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:17.311030 [DEBUG] tlsutil: Update with version 1
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:17.313403 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:17 [INFO]  raft: Node at 127.0.0.1:18016 [Leader] entering Leader state
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:17.407739 [INFO] consul: cluster leadership acquired
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:17.408226 [INFO] consul: New leader elected: Node 835f7527-48bb-338d-8996-3e36aced34c4
2019/12/30 18:54:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a248976b-f853-3563-cab2-2f75d8100afc Address:127.0.0.1:18022}]
2019/12/30 18:54:17 [INFO]  raft: Node at 127.0.0.1:18022 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.494213 [INFO] serf: EventMemberJoin: Node a248976b-f853-3563-cab2-2f75d8100afc.dc1 127.0.0.1
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.499879 [INFO] serf: EventMemberJoin: Node a248976b-f853-3563-cab2-2f75d8100afc 127.0.0.1
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.501763 [INFO] consul: Handled member-join event for server "Node a248976b-f853-3563-cab2-2f75d8100afc.dc1" in area "wan"
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.502179 [INFO] agent: Started DNS server 127.0.0.1:18017 (udp)
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.502250 [INFO] agent: Started DNS server 127.0.0.1:18017 (tcp)
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.502275 [INFO] consul: Adding LAN server Node a248976b-f853-3563-cab2-2f75d8100afc (Addr: tcp/127.0.0.1:18022) (DC: dc1)
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.504696 [INFO] agent: Started HTTP server on 127.0.0.1:18018 (tcp)
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:17.504803 [INFO] agent: started state syncer
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:17.515172 [DEBUG] http: Request GET /v1/agent/self (7µs) from=
2019/12/30 18:54:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:17 [INFO]  raft: Node at 127.0.0.1:18022 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:17.648198 [WARN] agent: Node name "Node 1db368be-08eb-c3b7-d35e-d69550f308ed" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:17.650907 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:17.653176 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:17.801544 [INFO] agent: Synced node info
2019/12/30 18:54:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:18 [INFO]  raft: Node at 127.0.0.1:18022 [Leader] entering Leader state
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.184073 [INFO] consul: cluster leadership acquired
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.184682 [INFO] consul: New leader elected: Node a248976b-f853-3563-cab2-2f75d8100afc
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.236441 [ERR] http: Request GET /v1/agent/self, error: <nil> from=
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.236618 [DEBUG] http: Request GET /v1/agent/checks (6.333µs) from=
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.236663 [INFO] agent: Requesting shutdown
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.236727 [INFO] consul: shutting down server
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.236776 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.236858 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:54:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bd29444e-fca7-e341-4d6f-0a660b87f78a Address:127.0.0.1:18028}]
2019/12/30 18:54:18 [INFO]  raft: Node at 127.0.0.1:18028 [Follower] entering Follower state (Leader: "")
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.261349 [INFO] serf: EventMemberJoin: Node bd29444e-fca7-e341-4d6f-0a660b87f78a.dc1 127.0.0.1
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.268116 [INFO] serf: EventMemberJoin: Node bd29444e-fca7-e341-4d6f-0a660b87f78a 127.0.0.1
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.269219 [INFO] consul: Adding LAN server Node bd29444e-fca7-e341-4d6f-0a660b87f78a (Addr: tcp/127.0.0.1:18028) (DC: dc1)
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.269224 [INFO] consul: Handled member-join event for server "Node bd29444e-fca7-e341-4d6f-0a660b87f78a.dc1" in area "wan"
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.269851 [INFO] agent: Started DNS server 127.0.0.1:18023 (tcp)
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.270206 [INFO] agent: Started DNS server 127.0.0.1:18023 (udp)
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.270352 [WARN] agent: Replacing socket "/tmp/consul-test/TestHTTPServer_UnixSocket_FileExists-consul171511516/test.sock"
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.273659 [INFO] agent: Started HTTP server on /tmp/consul-test/TestHTTPServer_UnixSocket_FileExists-consul171511516/test.sock (unix)
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.273763 [INFO] agent: started state syncer
2019/12/30 18:54:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:18 [INFO]  raft: Node at 127.0.0.1:18028 [Candidate] entering Candidate state in term 2
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.340171 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.418840 [INFO] manager: shutting down
2019/12/30 18:54:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1db368be-08eb-c3b7-d35e-d69550f308ed Address:127.0.0.1:18034}]
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.584934 [INFO] agent: consul server down
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.584992 [INFO] agent: shutdown complete
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.585049 [INFO] agent: Stopping DNS server 127.0.0.1:18017 (tcp)
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.585222 [INFO] agent: Stopping DNS server 127.0.0.1:18017 (udp)
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.585404 [INFO] agent: Stopping HTTP server 127.0.0.1:18018 (tcp)
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.585626 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.585707 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_BlockEndpoints (2.20s)
2019/12/30 18:54:18 [INFO]  raft: Node at 127.0.0.1:18034 [Follower] entering Follower state (Leader: "")
=== CONT  TestHTTPServer_UnixSocket
TestHTTPAPI_BlockEndpoints - 2019/12/30 18:54:18.587170 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.589031 [INFO] serf: EventMemberJoin: Node 1db368be-08eb-c3b7-d35e-d69550f308ed.dc1 127.0.0.1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.592291 [INFO] serf: EventMemberJoin: Node 1db368be-08eb-c3b7-d35e-d69550f308ed 127.0.0.1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.593543 [INFO] agent: Started DNS server 127.0.0.1:18029 (udp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.594744 [INFO] consul: Adding LAN server Node 1db368be-08eb-c3b7-d35e-d69550f308ed (Addr: tcp/127.0.0.1:18034) (DC: dc1)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.595450 [INFO] agent: Started DNS server 127.0.0.1:18029 (tcp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.598016 [INFO] agent: Started HTTP server on 127.0.0.1:18030 (tcp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.598138 [INFO] agent: started state syncer
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.598791 [INFO] consul: Handled member-join event for server "Node 1db368be-08eb-c3b7-d35e-d69550f308ed.dc1" in area "wan"
2019/12/30 18:54:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:18 [INFO]  raft: Node at 127.0.0.1:18034 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPServer_UnixSocket - 2019/12/30 18:54:18.664785 [WARN] agent: Node name "Node 971d3437-d004-b8a5-f3a0-08e3b3d9127d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPServer_UnixSocket - 2019/12/30 18:54:18.665332 [DEBUG] tlsutil: Update with version 1
TestHTTPServer_UnixSocket - 2019/12/30 18:54:18.667563 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:18 [INFO]  raft: Node at 127.0.0.1:18028 [Leader] entering Leader state
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.858274 [INFO] consul: cluster leadership acquired
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:18.858930 [INFO] consul: New leader elected: Node bd29444e-fca7-e341-4d6f-0a660b87f78a
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.926452 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.927467 [DEBUG] consul: Skipping self join check for "Node 835f7527-48bb-338d-8996-3e36aced34c4" since the cluster is too small
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:18.928875 [INFO] consul: member 'Node 835f7527-48bb-338d-8996-3e36aced34c4' joined, marking health alive
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.006051 [INFO] agent: Requesting shutdown
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.006155 [INFO] consul: shutting down server
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.006268 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.006800 [ERR] agent: failed to sync remote state: No cluster leader
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.097159 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.181489 [INFO] manager: shutting down
2019/12/30 18:54:19 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:19 [INFO]  raft: Node at 127.0.0.1:18034 [Leader] entering Leader state
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.277563 [INFO] consul: cluster leadership acquired
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.278032 [INFO] consul: New leader elected: Node 1db368be-08eb-c3b7-d35e-d69550f308ed
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.348420 [DEBUG] http: Request GET /v1/agent/self (7.333µs) from=
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.348519 [INFO] agent: Requesting shutdown
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.348594 [INFO] consul: shutting down server
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.348657 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.351818 [INFO] agent: consul server down
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.351907 [INFO] agent: shutdown complete
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.351964 [INFO] agent: Stopping DNS server 127.0.0.1:18023 (tcp)
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.352131 [INFO] agent: Stopping DNS server 127.0.0.1:18023 (udp)
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.353194 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.353592 [ERR] consul: failed to establish leadership: raft is already shutdown
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.353844 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.354374 [INFO] agent: Stopping HTTP server /tmp/consul-test/TestHTTPServer_UnixSocket_FileExists-consul171511516/test.sock (unix)
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.355459 [INFO] agent: Waiting for endpoints to shut down
TestHTTPServer_UnixSocket_FileExists - 2019/12/30 18:54:19.355563 [INFO] agent: Endpoints down
--- PASS: TestHTTPServer_UnixSocket_FileExists (2.19s)
=== CONT  TestFilterNonPassing
--- PASS: TestFilterNonPassing (0.00s)
=== CONT  TestHealthConnectServiceNodes_PassingFilter
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:19.424527 [WARN] agent: Node name "Node f18f5551-f314-1e04-2b5e-9fa7956004d3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:19.425701 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:19.428176 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.440178 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.536413 [INFO] manager: shutting down
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.615580 [INFO] agent: consul server down
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.615683 [INFO] agent: shutdown complete
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.615767 [INFO] agent: Stopping DNS server 127.0.0.1:18029 (tcp)
2019/12/30 18:54:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:971d3437-d004-b8a5-f3a0-08e3b3d9127d Address:127.0.0.1:18040}]
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.615967 [INFO] agent: Stopping DNS server 127.0.0.1:18029 (udp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616147 [INFO] agent: Stopping HTTP server 127.0.0.1:18030 (tcp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616290 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616389 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616434 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616471 [INFO] agent: Endpoints down
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616519 [INFO] agent: Requesting shutdown
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616571 [INFO] consul: shutting down server
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616633 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:19 [INFO]  raft: Node at 127.0.0.1:18040 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.616482 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.620388 [INFO] serf: EventMemberJoin: Node 971d3437-d004-b8a5-f3a0-08e3b3d9127d.dc1 127.0.0.1
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.623837 [INFO] serf: EventMemberJoin: Node 971d3437-d004-b8a5-f3a0-08e3b3d9127d 127.0.0.1
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.624750 [INFO] consul: Handled member-join event for server "Node 971d3437-d004-b8a5-f3a0-08e3b3d9127d.dc1" in area "wan"
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.624942 [INFO] consul: Adding LAN server Node 971d3437-d004-b8a5-f3a0-08e3b3d9127d (Addr: tcp/127.0.0.1:18040) (DC: dc1)
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.625575 [INFO] agent: Started DNS server 127.0.0.1:18035 (tcp)
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.625955 [INFO] agent: Started DNS server 127.0.0.1:18035 (udp)
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.628740 [INFO] agent: Started HTTP server on /tmp/consul-test/TestHTTPServer_UnixSocket-consul772822773/test.sock (unix)
TestHTTPServer_UnixSocket - 2019/12/30 18:54:19.628881 [INFO] agent: started state syncer
2019/12/30 18:54:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:19 [INFO]  raft: Node at 127.0.0.1:18040 [Candidate] entering Candidate state in term 2
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.715240 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.881963 [INFO] manager: shutting down
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.882551 [INFO] agent: consul server down
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.882646 [INFO] agent: shutdown complete
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.882786 [INFO] agent: Stopping DNS server 127.0.0.1:18011 (tcp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.883000 [INFO] agent: Stopping DNS server 127.0.0.1:18011 (udp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.883205 [INFO] agent: Stopping HTTP server 127.0.0.1:18012 (tcp)
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.883459 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_TranslateAddrHeader - 2019/12/30 18:54:19.883552 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_TranslateAddrHeader (4.13s)
=== CONT  TestHealthConnectServiceNodes_Filter
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:20.054944 [WARN] agent: Node name "Node 355cd325-22f2-00fe-765b-f1f9af4c7668" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:20.055403 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:20.067899 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:20 [INFO]  raft: Node at 127.0.0.1:18040 [Leader] entering Leader state
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.350023 [INFO] consul: cluster leadership acquired
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.351120 [INFO] consul: New leader elected: Node 971d3437-d004-b8a5-f3a0-08e3b3d9127d
2019/12/30 18:54:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f18f5551-f314-1e04-2b5e-9fa7956004d3 Address:127.0.0.1:18046}]
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.528692 [INFO] serf: EventMemberJoin: Node f18f5551-f314-1e04-2b5e-9fa7956004d3.dc1 127.0.0.1
2019/12/30 18:54:20 [INFO]  raft: Node at 127.0.0.1:18046 [Follower] entering Follower state (Leader: "")
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.532492 [INFO] serf: EventMemberJoin: Node f18f5551-f314-1e04-2b5e-9fa7956004d3 127.0.0.1
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.534963 [INFO] agent: Started DNS server 127.0.0.1:18041 (udp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.535304 [INFO] consul: Adding LAN server Node f18f5551-f314-1e04-2b5e-9fa7956004d3 (Addr: tcp/127.0.0.1:18046) (DC: dc1)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.535423 [INFO] agent: Started DNS server 127.0.0.1:18041 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.535528 [INFO] consul: Handled member-join event for server "Node f18f5551-f314-1e04-2b5e-9fa7956004d3.dc1" in area "wan"
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.537985 [INFO] agent: Started HTTP server on 127.0.0.1:18042 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:20.538319 [INFO] agent: started state syncer
2019/12/30 18:54:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:20 [INFO]  raft: Node at 127.0.0.1:18046 [Candidate] entering Candidate state in term 2
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.724599 [INFO] agent: Synced node info
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.729868 [DEBUG] http: Request GET /v1/agent/self (97.100235ms) from=@
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.743518 [INFO] agent: Requesting shutdown
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.743610 [INFO] consul: shutting down server
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.743675 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.890263 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/12/30 18:54:20.974730 [INFO] manager: shutting down
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.182447 [INFO] agent: consul server down
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.182564 [INFO] agent: shutdown complete
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.182641 [INFO] agent: Stopping DNS server 127.0.0.1:18035 (tcp)
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.182865 [INFO] agent: Stopping DNS server 127.0.0.1:18035 (udp)
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.183092 [INFO] agent: Stopping HTTP server /tmp/consul-test/TestHTTPServer_UnixSocket-consul772822773/test.sock (unix)
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.183876 [INFO] agent: Waiting for endpoints to shut down
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.183966 [INFO] agent: Endpoints down
TestHTTPServer_UnixSocket - 2019/12/30 18:54:21.184262 [ERR] consul: failed to establish leadership: leadership lost while committing log
--- PASS: TestHTTPServer_UnixSocket (2.60s)
=== CONT  TestHealthConnectServiceNodes
2019/12/30 18:54:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:355cd325-22f2-00fe-765b-f1f9af4c7668 Address:127.0.0.1:18052}]
2019/12/30 18:54:21 [INFO]  raft: Node at 127.0.0.1:18052 [Follower] entering Follower state (Leader: "")
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.196130 [INFO] serf: EventMemberJoin: Node 355cd325-22f2-00fe-765b-f1f9af4c7668.dc1 127.0.0.1
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.201211 [INFO] serf: EventMemberJoin: Node 355cd325-22f2-00fe-765b-f1f9af4c7668 127.0.0.1
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.202227 [INFO] consul: Adding LAN server Node 355cd325-22f2-00fe-765b-f1f9af4c7668 (Addr: tcp/127.0.0.1:18052) (DC: dc1)
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.202820 [INFO] consul: Handled member-join event for server "Node 355cd325-22f2-00fe-765b-f1f9af4c7668.dc1" in area "wan"
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.204457 [INFO] agent: Started DNS server 127.0.0.1:18047 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.204559 [INFO] agent: Started DNS server 127.0.0.1:18047 (udp)
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.207813 [INFO] agent: Started HTTP server on 127.0.0.1:18048 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.208595 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes - 2019/12/30 18:54:21.250534 [WARN] agent: Node name "Node 4e629d1a-65b1-3abb-c299-5edb1e0ff188" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes - 2019/12/30 18:54:21.251089 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes - 2019/12/30 18:54:21.253296 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:21 [INFO]  raft: Node at 127.0.0.1:18052 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:21 [INFO]  raft: Node at 127.0.0.1:18046 [Leader] entering Leader state
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:21.260639 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:21.261184 [INFO] consul: New leader elected: Node f18f5551-f314-1e04-2b5e-9fa7956004d3
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:21.867694 [INFO] agent: Synced node info
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:21.869893 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:21.869982 [INFO] consul: shutting down server
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:21.870029 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:21 [INFO]  raft: Node at 127.0.0.1:18052 [Leader] entering Leader state
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.872210 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:21.872740 [INFO] consul: New leader elected: Node 355cd325-22f2-00fe-765b-f1f9af4c7668
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:21.961984 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.075243 [INFO] manager: shutting down
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.165538 [INFO] agent: consul server down
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.165633 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.165699 [INFO] agent: Stopping DNS server 127.0.0.1:18041 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.165864 [INFO] agent: Stopping DNS server 127.0.0.1:18041 (udp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.165985 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.166043 [INFO] agent: Stopping HTTP server 127.0.0.1:18042 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.166306 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.166375 [ERR] consul: failed to establish leadership: raft is already shutdown
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.166387 [INFO] agent: Endpoints down
--- PASS: TestHealthConnectServiceNodes_PassingFilter (2.81s)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/30 18:54:22.166563 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
=== CONT  TestHealthServiceNodes_WanTranslation
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:22.241746 [INFO] agent: Synced node info
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:22.241896 [DEBUG] agent: Node info in sync
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:22.252736 [WARN] agent: Node name "Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:22.253247 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:22.256665 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4e629d1a-65b1-3abb-c299-5edb1e0ff188 Address:127.0.0.1:18058}]
2019/12/30 18:54:22 [INFO]  raft: Node at 127.0.0.1:18058 [Follower] entering Follower state (Leader: "")
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.356258 [INFO] serf: EventMemberJoin: Node 4e629d1a-65b1-3abb-c299-5edb1e0ff188.dc1 127.0.0.1
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.364097 [INFO] serf: EventMemberJoin: Node 4e629d1a-65b1-3abb-c299-5edb1e0ff188 127.0.0.1
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.366118 [INFO] consul: Adding LAN server Node 4e629d1a-65b1-3abb-c299-5edb1e0ff188 (Addr: tcp/127.0.0.1:18058) (DC: dc1)
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.367292 [INFO] consul: Handled member-join event for server "Node 4e629d1a-65b1-3abb-c299-5edb1e0ff188.dc1" in area "wan"
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.368545 [INFO] agent: Started DNS server 127.0.0.1:18053 (tcp)
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.368651 [INFO] agent: Started DNS server 127.0.0.1:18053 (udp)
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.371339 [INFO] agent: Started HTTP server on 127.0.0.1:18054 (tcp)
TestHealthConnectServiceNodes - 2019/12/30 18:54:22.371524 [INFO] agent: started state syncer
2019/12/30 18:54:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:22 [INFO]  raft: Node at 127.0.0.1:18058 [Candidate] entering Candidate state in term 2
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:22.947116 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:22.947332 [INFO] consul: shutting down server
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:22.947589 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.015318 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:23 [INFO]  raft: Node at 127.0.0.1:18058 [Leader] entering Leader state
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.019288 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.019865 [INFO] consul: New leader elected: Node 4e629d1a-65b1-3abb-c299-5edb1e0ff188
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.082012 [INFO] manager: shutting down
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.082532 [INFO] agent: consul server down
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.082586 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.082640 [INFO] agent: Stopping DNS server 127.0.0.1:18047 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.082793 [INFO] agent: Stopping DNS server 127.0.0.1:18047 (udp)
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.082953 [INFO] agent: Stopping HTTP server 127.0.0.1:18048 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.083139 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.083205 [INFO] agent: Endpoints down
--- PASS: TestHealthConnectServiceNodes_Filter (3.20s)
=== CONT  TestHealthServiceNodes_DistanceSort
TestHealthConnectServiceNodes_Filter - 2019/12/30 18:54:23.089062 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:23.264175 [WARN] agent: Node name "Node bad55930-ad41-2fc8-03e3-eb79a1842760" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:23.266377 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:23.270779 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:90af8a09-fb3c-7a1e-5251-cc688dc5413b Address:127.0.0.1:18064}]
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.303130 [INFO] serf: EventMemberJoin: Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b.dc1 127.0.0.1
2019/12/30 18:54:23 [INFO]  raft: Node at 127.0.0.1:18064 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.321436 [INFO] serf: EventMemberJoin: Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.323714 [INFO] consul: Adding LAN server Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b (Addr: tcp/127.0.0.1:18064) (DC: dc1)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.326629 [INFO] consul: Handled member-join event for server "Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b.dc1" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.328217 [INFO] agent: Started DNS server 127.0.0.1:18059 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.328732 [INFO] agent: Started DNS server 127.0.0.1:18059 (udp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.331349 [INFO] agent: Started HTTP server on 127.0.0.1:18060 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:23.331453 [INFO] agent: started state syncer
2019/12/30 18:54:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:23 [INFO]  raft: Node at 127.0.0.1:18064 [Candidate] entering Candidate state in term 2
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.851027 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.851138 [INFO] consul: shutting down server
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.851194 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.851151 [INFO] agent: Synced node info
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.851610 [DEBUG] agent: Node info in sync
TestHealthConnectServiceNodes - 2019/12/30 18:54:23.941339 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.011751 [DEBUG] agent: Node info in sync
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.023743 [INFO] manager: shutting down
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.024706 [INFO] agent: consul server down
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.024773 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.024853 [INFO] agent: Stopping DNS server 127.0.0.1:18053 (tcp)
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.025031 [INFO] agent: Stopping DNS server 127.0.0.1:18053 (udp)
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.025235 [INFO] agent: Stopping HTTP server 127.0.0.1:18054 (tcp)
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.025463 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.025577 [INFO] agent: Endpoints down
--- PASS: TestHealthConnectServiceNodes (2.84s)
=== CONT  TestHealthServiceNodes_NodeMetaFilter
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.026670 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.026896 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestHealthConnectServiceNodes - 2019/12/30 18:54:24.026957 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.083287 [WARN] agent: Node name "Node b77cdb8a-72bd-a090-172b-e9e691afb57f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.083718 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.086075 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:24 [INFO]  raft: Node at 127.0.0.1:18064 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.091555 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.092030 [INFO] consul: New leader elected: Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.122201 [ERR] agent: failed to sync remote state: ACL not found
2019/12/30 18:54:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bad55930-ad41-2fc8-03e3-eb79a1842760 Address:127.0.0.1:18070}]
2019/12/30 18:54:24 [INFO]  raft: Node at 127.0.0.1:18070 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.321442 [INFO] serf: EventMemberJoin: Node bad55930-ad41-2fc8-03e3-eb79a1842760.dc1 127.0.0.1
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.329967 [INFO] serf: EventMemberJoin: Node bad55930-ad41-2fc8-03e3-eb79a1842760 127.0.0.1
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.331257 [INFO] consul: Adding LAN server Node bad55930-ad41-2fc8-03e3-eb79a1842760 (Addr: tcp/127.0.0.1:18070) (DC: dc1)
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.331928 [INFO] consul: Handled member-join event for server "Node bad55930-ad41-2fc8-03e3-eb79a1842760.dc1" in area "wan"
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.335612 [INFO] agent: Started DNS server 127.0.0.1:18065 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.336010 [INFO] agent: Started DNS server 127.0.0.1:18065 (udp)
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.338388 [INFO] agent: Started HTTP server on 127.0.0.1:18066 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:24.338492 [INFO] agent: started state syncer
2019/12/30 18:54:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:24 [INFO]  raft: Node at 127.0.0.1:18070 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.407188 [INFO] acl: initializing acls
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.558662 [INFO] consul: Created ACL 'global-management' policy
jones - 2019/12/30 18:54:24.607293 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:54:24.607374 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.703461 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.704590 [INFO] serf: EventMemberUpdate: Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:24.705307 [INFO] serf: EventMemberUpdate: Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b.dc1
2019/12/30 18:54:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b77cdb8a-72bd-a090-172b-e9e691afb57f Address:127.0.0.1:18076}]
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.903070 [INFO] serf: EventMemberJoin: Node b77cdb8a-72bd-a090-172b-e9e691afb57f.dc1 127.0.0.1
2019/12/30 18:54:24 [INFO]  raft: Node at 127.0.0.1:18076 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.915787 [INFO] serf: EventMemberJoin: Node b77cdb8a-72bd-a090-172b-e9e691afb57f 127.0.0.1
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.917220 [INFO] consul: Adding LAN server Node b77cdb8a-72bd-a090-172b-e9e691afb57f (Addr: tcp/127.0.0.1:18076) (DC: dc1)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.918968 [INFO] consul: Handled member-join event for server "Node b77cdb8a-72bd-a090-172b-e9e691afb57f.dc1" in area "wan"
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.920002 [INFO] agent: Started DNS server 127.0.0.1:18071 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.920551 [INFO] agent: Started DNS server 127.0.0.1:18071 (udp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.922940 [INFO] agent: Started HTTP server on 127.0.0.1:18072 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:24.923051 [INFO] agent: started state syncer
2019/12/30 18:54:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:24 [INFO]  raft: Node at 127.0.0.1:18076 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:25 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:25 [INFO]  raft: Node at 127.0.0.1:18070 [Leader] entering Leader state
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:25.060209 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:25.060671 [INFO] consul: New leader elected: Node bad55930-ad41-2fc8-03e3-eb79a1842760
2019/12/30 18:54:25 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:25 [INFO]  raft: Node at 127.0.0.1:18076 [Leader] entering Leader state
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:25.485359 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:25.485841 [INFO] consul: New leader elected: Node b77cdb8a-72bd-a090-172b-e9e691afb57f
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:25.632720 [INFO] agent: Synced node info
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:25.632848 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:25.774313 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:25.774887 [DEBUG] consul: Skipping self join check for "Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:25.774996 [INFO] consul: member 'Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b' joined, marking health alive
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:25.858179 [INFO] agent: Synced node info
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:25.942616 [DEBUG] consul: Skipping self join check for "Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b" since the cluster is too small
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.092788 [WARN] agent: Node name "Node 3cc821cc-c347-48e8-d82a-2c9acd583da2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.093525 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.097912 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:54:26.193283 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:54:26.193395 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.351657 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.351760 [INFO] consul: shutting down server
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.351812 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:26.407356 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.431976 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.511927 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.515562 [INFO] manager: shutting down
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.599013 [INFO] agent: consul server down
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.599100 [INFO] agent: shutdown complete
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.599174 [INFO] agent: Stopping DNS server 127.0.0.1:18071 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.599468 [INFO] agent: Stopping DNS server 127.0.0.1:18071 (udp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.599726 [INFO] agent: Stopping HTTP server 127.0.0.1:18072 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.599995 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.600082 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_NodeMetaFilter (2.57s)
=== CONT  TestHealthServiceNodes
TestHealthServiceNodes_NodeMetaFilter - 2019/12/30 18:54:26.609561 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes - 2019/12/30 18:54:26.705336 [WARN] agent: Node name "Node a2346663-7454-6a18-a63c-440d920ccb75" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes - 2019/12/30 18:54:26.706079 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes - 2019/12/30 18:54:26.708621 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.791781 [INFO] agent: Synced node info
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.791965 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:26.879085 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:26.879233 [INFO] consul: shutting down server
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:26.879299 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3cc821cc-c347-48e8-d82a-2c9acd583da2 Address:127.0.0.1:18082}]
2019/12/30 18:54:26 [INFO]  raft: Node at 127.0.0.1:18082 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.961530 [INFO] serf: EventMemberJoin: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.968856 [INFO] serf: EventMemberJoin: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.971838 [INFO] consul: Adding LAN server Node 3cc821cc-c347-48e8-d82a-2c9acd583da2 (Addr: tcp/127.0.0.1:18082) (DC: dc2)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.974773 [INFO] consul: Handled member-join event for server "Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.976997 [INFO] agent: Started DNS server 127.0.0.1:18077 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.977568 [INFO] agent: Started DNS server 127.0.0.1:18077 (udp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.980904 [INFO] agent: Started HTTP server on 127.0.0.1:18078 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:26.981029 [INFO] agent: started state syncer
2019/12/30 18:54:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:27 [INFO]  raft: Node at 127.0.0.1:18082 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.051917 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:27.061053 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.140519 [INFO] manager: shutting down
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.141706 [INFO] agent: consul server down
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.141783 [INFO] agent: shutdown complete
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.141847 [INFO] agent: Stopping DNS server 127.0.0.1:18065 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.142001 [INFO] agent: Stopping DNS server 127.0.0.1:18065 (udp)
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.142181 [INFO] agent: Stopping HTTP server 127.0.0.1:18066 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.142404 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.142481 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_DistanceSort (4.06s)
=== CONT  TestHealthServiceChecks_DistanceSort
TestHealthServiceNodes_DistanceSort - 2019/12/30 18:54:27.144190 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:27.239083 [WARN] agent: Node name "Node 007b9218-e8d4-033e-f74c-066597b0bdfe" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:27.241983 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:27.244651 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:28 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:28 [INFO]  raft: Node at 127.0.0.1:18082 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:28.115901 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:28.116358 [INFO] consul: New leader elected: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2
2019/12/30 18:54:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a2346663-7454-6a18-a63c-440d920ccb75 Address:127.0.0.1:18088}]
2019/12/30 18:54:28 [INFO]  raft: Node at 127.0.0.1:18088 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes - 2019/12/30 18:54:28.152453 [INFO] serf: EventMemberJoin: Node a2346663-7454-6a18-a63c-440d920ccb75.dc1 127.0.0.1
TestHealthServiceNodes - 2019/12/30 18:54:28.161537 [INFO] serf: EventMemberJoin: Node a2346663-7454-6a18-a63c-440d920ccb75 127.0.0.1
TestHealthServiceNodes - 2019/12/30 18:54:28.167141 [INFO] consul: Adding LAN server Node a2346663-7454-6a18-a63c-440d920ccb75 (Addr: tcp/127.0.0.1:18088) (DC: dc1)
TestHealthServiceNodes - 2019/12/30 18:54:28.168895 [INFO] consul: Handled member-join event for server "Node a2346663-7454-6a18-a63c-440d920ccb75.dc1" in area "wan"
TestHealthServiceNodes - 2019/12/30 18:54:28.170886 [INFO] agent: Started DNS server 127.0.0.1:18083 (udp)
TestHealthServiceNodes - 2019/12/30 18:54:28.171251 [INFO] agent: Started DNS server 127.0.0.1:18083 (tcp)
TestHealthServiceNodes - 2019/12/30 18:54:28.177604 [INFO] agent: Started HTTP server on 127.0.0.1:18084 (tcp)
TestHealthServiceNodes - 2019/12/30 18:54:28.177726 [INFO] agent: started state syncer
2019/12/30 18:54:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:28 [INFO]  raft: Node at 127.0.0.1:18088 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:28.214835 [ERR] agent: failed to sync remote state: ACL not found
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:28.529822 [INFO] acl: initializing acls
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:29.016782 [ERR] agent: failed to sync remote state: ACL not found
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:29.452724 [INFO] consul: Created ACL 'global-management' policy
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:29.454941 [INFO] acl: initializing acls
2019/12/30 18:54:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:29 [INFO]  raft: Node at 127.0.0.1:18088 [Leader] entering Leader state
2019/12/30 18:54:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:007b9218-e8d4-033e-f74c-066597b0bdfe Address:127.0.0.1:18094}]
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.758493 [INFO] serf: EventMemberJoin: Node 007b9218-e8d4-033e-f74c-066597b0bdfe.dc1 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:29.777894 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:29.778036 [DEBUG] acl: transitioning out of legacy ACL mode
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:29.778904 [INFO] serf: EventMemberUpdate: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:29.779902 [INFO] serf: EventMemberUpdate: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
2019/12/30 18:54:29 [INFO]  raft: Node at 127.0.0.1:18094 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes - 2019/12/30 18:54:29.781997 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes - 2019/12/30 18:54:29.782473 [INFO] consul: New leader elected: Node a2346663-7454-6a18-a63c-440d920ccb75
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.793731 [INFO] serf: EventMemberJoin: Node 007b9218-e8d4-033e-f74c-066597b0bdfe 127.0.0.1
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.796446 [INFO] consul: Adding LAN server Node 007b9218-e8d4-033e-f74c-066597b0bdfe (Addr: tcp/127.0.0.1:18094) (DC: dc1)
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.797037 [INFO] consul: Handled member-join event for server "Node 007b9218-e8d4-033e-f74c-066597b0bdfe.dc1" in area "wan"
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.798367 [INFO] agent: Started DNS server 127.0.0.1:18089 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.798884 [INFO] agent: Started DNS server 127.0.0.1:18089 (udp)
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.803578 [INFO] agent: Started HTTP server on 127.0.0.1:18090 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:29.803691 [INFO] agent: started state syncer
2019/12/30 18:54:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:29 [INFO]  raft: Node at 127.0.0.1:18094 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:54:29.987793 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:54:29.987893 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:30.387028 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:30.387940 [INFO] serf: EventMemberUpdate: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:30.388602 [INFO] serf: EventMemberUpdate: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes - 2019/12/30 18:54:30.389184 [INFO] agent: Synced node info
TestHealthServiceNodes - 2019/12/30 18:54:30.389673 [DEBUG] agent: Node info in sync
2019/12/30 18:54:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:31 [INFO]  raft: Node at 127.0.0.1:18094 [Leader] entering Leader state
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:31.002363 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:31.002925 [INFO] consul: New leader elected: Node 007b9218-e8d4-033e-f74c-066597b0bdfe
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:31.524818 [INFO] agent: Synced node info
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:31.524948 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.682826 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.683349 [DEBUG] consul: Skipping self join check for "Node 3cc821cc-c347-48e8-d82a-2c9acd583da2" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.683459 [INFO] consul: member 'Node 3cc821cc-c347-48e8-d82a-2c9acd583da2' joined, marking health alive
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:31.863551 [DEBUG] agent: Node info in sync
TestHealthServiceNodes - 2019/12/30 18:54:31.899792 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes - 2019/12/30 18:54:31.900280 [DEBUG] consul: Skipping self join check for "Node a2346663-7454-6a18-a63c-440d920ccb75" since the cluster is too small
TestHealthServiceNodes - 2019/12/30 18:54:31.900453 [INFO] consul: member 'Node a2346663-7454-6a18-a63c-440d920ccb75' joined, marking health alive
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.901413 [DEBUG] consul: Skipping self join check for "Node 3cc821cc-c347-48e8-d82a-2c9acd583da2" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.901984 [DEBUG] consul: Skipping self join check for "Node 3cc821cc-c347-48e8-d82a-2c9acd583da2" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.914643 [INFO] agent: (WAN) joining: [127.0.0.1:18063]
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.915587 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:18063
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.915940 [DEBUG] memberlist: Stream connection from=127.0.0.1:48718
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.919299 [INFO] serf: EventMemberJoin: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.920150 [INFO] serf: EventMemberJoin: Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b.dc1 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.920188 [INFO] consul: Handled member-join event for server "Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.920676 [INFO] agent: (WAN) joined: 1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.921021 [INFO] consul: Handled member-join event for server "Node 90af8a09-fb3c-7a1e-5251-cc688dc5413b.dc1" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.964638 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.965914 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:31.966904 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.275550 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.308373 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.308654 [INFO] consul: shutting down server
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.309095 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.316608 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.464006 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.855388 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:32.963411 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes - 2019/12/30 18:54:33.266898 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestHealthServiceNodes - 2019/12/30 18:54:33.266978 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.316313 [DEBUG] serf: messageJoinType: Node 3cc821cc-c347-48e8-d82a-2c9acd583da2.dc2
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.348854 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes - 2019/12/30 18:54:33.352906 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.449009 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.449105 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.449698 [INFO] agent: consul server down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.449771 [INFO] agent: shutdown complete
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.449828 [INFO] agent: Stopping DNS server 127.0.0.1:18077 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.450006 [INFO] agent: Stopping DNS server 127.0.0.1:18077 (udp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.450189 [INFO] agent: Stopping HTTP server 127.0.0.1:18078 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.450399 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.450475 [INFO] agent: Endpoints down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.450524 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.450590 [INFO] consul: shutting down server
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.450642 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.515472 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.598915 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.599012 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.600026 [INFO] agent: consul server down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.600092 [INFO] agent: shutdown complete
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.600150 [INFO] agent: Stopping DNS server 127.0.0.1:18059 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.600297 [INFO] agent: Stopping DNS server 127.0.0.1:18059 (udp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.600447 [INFO] agent: Stopping HTTP server 127.0.0.1:18060 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.600637 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_WanTranslation - 2019/12/30 18:54:33.600708 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_WanTranslation (11.43s)
=== CONT  TestHealthServiceChecks_Filtering
TestHealthServiceNodes - 2019/12/30 18:54:33.627196 [INFO] agent: Requesting shutdown
TestHealthServiceNodes - 2019/12/30 18:54:33.627320 [INFO] consul: shutting down server
TestHealthServiceNodes - 2019/12/30 18:54:33.627384 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:33.693256 [WARN] agent: Node name "Node a51024dc-44d4-72c9-ef78-3e916be03051" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:33.693775 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:33.696451 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes - 2019/12/30 18:54:33.882144 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes - 2019/12/30 18:54:33.950323 [INFO] manager: shutting down
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:33.950919 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes - 2019/12/30 18:54:33.951114 [INFO] agent: consul server down
TestHealthServiceNodes - 2019/12/30 18:54:33.951168 [INFO] agent: shutdown complete
TestHealthServiceNodes - 2019/12/30 18:54:33.951230 [INFO] agent: Stopping DNS server 127.0.0.1:18083 (tcp)
TestHealthServiceNodes - 2019/12/30 18:54:33.951411 [INFO] agent: Stopping DNS server 127.0.0.1:18083 (udp)
TestHealthServiceNodes - 2019/12/30 18:54:33.951587 [INFO] agent: Stopping HTTP server 127.0.0.1:18084 (tcp)
TestHealthServiceNodes - 2019/12/30 18:54:33.951800 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes - 2019/12/30 18:54:33.951876 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes (7.35s)
=== CONT  TestHealthServiceChecks_NodeMetaFilter
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:33.953054 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:33.953589 [DEBUG] consul: Skipping self join check for "Node 007b9218-e8d4-033e-f74c-066597b0bdfe" since the cluster is too small
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:33.953739 [INFO] consul: member 'Node 007b9218-e8d4-033e-f74c-066597b0bdfe' joined, marking health alive
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:34.101059 [WARN] agent: Node name "Node 956903df-d93e-17e6-5cd4-715fca68f962" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:34.101692 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:34.107299 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a51024dc-44d4-72c9-ef78-3e916be03051 Address:127.0.0.1:18100}]
2019/12/30 18:54:35 [INFO]  raft: Node at 127.0.0.1:18100 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.062947 [INFO] serf: EventMemberJoin: Node a51024dc-44d4-72c9-ef78-3e916be03051.dc1 127.0.0.1
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.068938 [INFO] serf: EventMemberJoin: Node a51024dc-44d4-72c9-ef78-3e916be03051 127.0.0.1
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.070325 [INFO] consul: Adding LAN server Node a51024dc-44d4-72c9-ef78-3e916be03051 (Addr: tcp/127.0.0.1:18100) (DC: dc1)
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.070429 [INFO] consul: Handled member-join event for server "Node a51024dc-44d4-72c9-ef78-3e916be03051.dc1" in area "wan"
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.071465 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.071541 [INFO] consul: shutting down server
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.071591 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.073516 [INFO] agent: Started DNS server 127.0.0.1:18095 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.073885 [INFO] agent: Started DNS server 127.0.0.1:18095 (udp)
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.076341 [INFO] agent: Started HTTP server on 127.0.0.1:18096 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.076452 [INFO] agent: started state syncer
2019/12/30 18:54:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:35 [INFO]  raft: Node at 127.0.0.1:18100 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.198857 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:956903df-d93e-17e6-5cd4-715fca68f962 Address:127.0.0.1:18106}]
2019/12/30 18:54:35 [INFO]  raft: Node at 127.0.0.1:18106 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.203430 [INFO] serf: EventMemberJoin: Node 956903df-d93e-17e6-5cd4-715fca68f962.dc1 127.0.0.1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.209084 [INFO] serf: EventMemberJoin: Node 956903df-d93e-17e6-5cd4-715fca68f962 127.0.0.1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.211092 [INFO] consul: Adding LAN server Node 956903df-d93e-17e6-5cd4-715fca68f962 (Addr: tcp/127.0.0.1:18106) (DC: dc1)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.211778 [INFO] consul: Handled member-join event for server "Node 956903df-d93e-17e6-5cd4-715fca68f962.dc1" in area "wan"
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.213491 [INFO] agent: Started DNS server 127.0.0.1:18101 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.216422 [INFO] agent: Started DNS server 127.0.0.1:18101 (udp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.225575 [INFO] agent: Started HTTP server on 127.0.0.1:18102 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.226043 [INFO] agent: started state syncer
2019/12/30 18:54:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:35 [INFO]  raft: Node at 127.0.0.1:18106 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.275329 [INFO] manager: shutting down
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.275982 [INFO] agent: consul server down
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.276031 [INFO] agent: shutdown complete
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.276082 [INFO] agent: Stopping DNS server 127.0.0.1:18089 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.276230 [INFO] agent: Stopping DNS server 127.0.0.1:18089 (udp)
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.276381 [INFO] agent: Stopping HTTP server 127.0.0.1:18090 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.276572 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_DistanceSort - 2019/12/30 18:54:35.276636 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_DistanceSort (8.13s)
=== CONT  TestHealthServiceChecks
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks - 2019/12/30 18:54:35.408337 [WARN] agent: Node name "Node 51206dd2-d56f-131b-7c0b-139785889449" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks - 2019/12/30 18:54:35.408996 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks - 2019/12/30 18:54:35.413776 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:35 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:35 [INFO]  raft: Node at 127.0.0.1:18100 [Leader] entering Leader state
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.699908 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:35.700450 [INFO] consul: New leader elected: Node a51024dc-44d4-72c9-ef78-3e916be03051
2019/12/30 18:54:35 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:35 [INFO]  raft: Node at 127.0.0.1:18106 [Leader] entering Leader state
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.873567 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:35.873955 [INFO] consul: New leader elected: Node 956903df-d93e-17e6-5cd4-715fca68f962
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:36.008269 [INFO] agent: Synced node info
2019/12/30 18:54:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:51206dd2-d56f-131b-7c0b-139785889449 Address:127.0.0.1:18112}]
2019/12/30 18:54:36 [INFO]  raft: Node at 127.0.0.1:18112 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks - 2019/12/30 18:54:36.413269 [INFO] serf: EventMemberJoin: Node 51206dd2-d56f-131b-7c0b-139785889449.dc1 127.0.0.1
TestHealthServiceChecks - 2019/12/30 18:54:36.419181 [INFO] serf: EventMemberJoin: Node 51206dd2-d56f-131b-7c0b-139785889449 127.0.0.1
TestHealthServiceChecks - 2019/12/30 18:54:36.421122 [INFO] consul: Adding LAN server Node 51206dd2-d56f-131b-7c0b-139785889449 (Addr: tcp/127.0.0.1:18112) (DC: dc1)
TestHealthServiceChecks - 2019/12/30 18:54:36.421938 [INFO] consul: Handled member-join event for server "Node 51206dd2-d56f-131b-7c0b-139785889449.dc1" in area "wan"
TestHealthServiceChecks - 2019/12/30 18:54:36.425973 [INFO] agent: Started DNS server 127.0.0.1:18107 (tcp)
TestHealthServiceChecks - 2019/12/30 18:54:36.426191 [INFO] agent: Started DNS server 127.0.0.1:18107 (udp)
TestHealthServiceChecks - 2019/12/30 18:54:36.433887 [INFO] agent: Started HTTP server on 127.0.0.1:18108 (tcp)
TestHealthServiceChecks - 2019/12/30 18:54:36.434021 [INFO] agent: started state syncer
2019/12/30 18:54:36 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:36 [INFO]  raft: Node at 127.0.0.1:18112 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:36.518292 [INFO] agent: Synced node info
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:36.518419 [DEBUG] agent: Node info in sync
2019/12/30 18:54:37 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:37 [INFO]  raft: Node at 127.0.0.1:18112 [Leader] entering Leader state
TestHealthServiceChecks - 2019/12/30 18:54:37.236255 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks - 2019/12/30 18:54:37.236843 [INFO] consul: New leader elected: Node 51206dd2-d56f-131b-7c0b-139785889449
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:37.449564 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:37.450021 [DEBUG] consul: Skipping self join check for "Node a51024dc-44d4-72c9-ef78-3e916be03051" since the cluster is too small
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:37.450175 [INFO] consul: member 'Node a51024dc-44d4-72c9-ef78-3e916be03051' joined, marking health alive
TestHealthServiceChecks - 2019/12/30 18:54:37.899917 [INFO] agent: Synced node info
TestHealthServiceChecks - 2019/12/30 18:54:37.900041 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.049615 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.050049 [DEBUG] consul: Skipping self join check for "Node 956903df-d93e-17e6-5cd4-715fca68f962" since the cluster is too small
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.050201 [INFO] consul: member 'Node 956903df-d93e-17e6-5cd4-715fca68f962' joined, marking health alive
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.295020 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.295150 [INFO] consul: shutting down server
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.295201 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.367578 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.430004 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.473447 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.473541 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.490708 [INFO] manager: shutting down
TestHealthServiceChecks - 2019/12/30 18:54:38.595745 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.674235 [WARN] agent: Deregistering check "consul check" failed. leadership lost while committing log
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.674358 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.674659 [INFO] agent: consul server down
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.674724 [INFO] agent: shutdown complete
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.674788 [INFO] agent: Stopping DNS server 127.0.0.1:18095 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.674936 [INFO] agent: Stopping DNS server 127.0.0.1:18095 (udp)
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.675109 [INFO] agent: Stopping HTTP server 127.0.0.1:18096 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.675329 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_Filtering - 2019/12/30 18:54:38.675412 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_Filtering (5.07s)
=== CONT  TestDNS_trimUDPResponse_NoTrim
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.684312 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.685035 [INFO] consul: shutting down server
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.685218 [WARN] serf: Shutdown without a Leave
--- PASS: TestDNS_trimUDPResponse_NoTrim (0.05s)
=== CONT  TestHealthNodeChecks_Filtering
WARNING: bootstrap = true: do not enable unless necessary
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:38.803204 [WARN] agent: Node name "Node 500cf44a-7a74-4410-eef7-2f4de6986be4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:38.803736 [DEBUG] tlsutil: Update with version 1
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:38.806295 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.823901 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.973960 [INFO] manager: shutting down
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.974524 [INFO] agent: consul server down
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.974585 [INFO] agent: shutdown complete
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.974642 [INFO] agent: Stopping DNS server 127.0.0.1:18101 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.974780 [INFO] agent: Stopping DNS server 127.0.0.1:18101 (udp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.974943 [INFO] agent: Stopping HTTP server 127.0.0.1:18102 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.975188 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_NodeMetaFilter - 2019/12/30 18:54:38.975264 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_NodeMetaFilter (5.02s)
=== CONT  TestAgent_GetCoordinate
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_GetCoordinate - 2019/12/30 18:54:39.038250 [WARN] agent: Node name "Node 033a1a71-e9b1-b8d9-d7fa-bfa03f579482" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_GetCoordinate - 2019/12/30 18:54:39.038690 [DEBUG] tlsutil: Update with version 1
TestAgent_GetCoordinate - 2019/12/30 18:54:39.041012 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:54:39.231838 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:54:39.231929 [DEBUG] agent: Node info in sync
TestHealthServiceChecks - 2019/12/30 18:54:39.249916 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks - 2019/12/30 18:54:39.250347 [DEBUG] consul: Skipping self join check for "Node 51206dd2-d56f-131b-7c0b-139785889449" since the cluster is too small
TestHealthServiceChecks - 2019/12/30 18:54:39.250506 [INFO] consul: member 'Node 51206dd2-d56f-131b-7c0b-139785889449' joined, marking health alive
2019/12/30 18:54:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:500cf44a-7a74-4410-eef7-2f4de6986be4 Address:127.0.0.1:18118}]
TestHealthServiceChecks - 2019/12/30 18:54:39.980451 [INFO] agent: Requesting shutdown
TestHealthServiceChecks - 2019/12/30 18:54:39.980544 [INFO] consul: shutting down server
TestHealthServiceChecks - 2019/12/30 18:54:39.980594 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.981347 [INFO] serf: EventMemberJoin: Node 500cf44a-7a74-4410-eef7-2f4de6986be4.dc1 127.0.0.1
2019/12/30 18:54:39 [INFO]  raft: Node at 127.0.0.1:18118 [Follower] entering Follower state (Leader: "")
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.985492 [INFO] serf: EventMemberJoin: Node 500cf44a-7a74-4410-eef7-2f4de6986be4 127.0.0.1
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.986275 [INFO] consul: Adding LAN server Node 500cf44a-7a74-4410-eef7-2f4de6986be4 (Addr: tcp/127.0.0.1:18118) (DC: dc1)
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.986580 [INFO] consul: Handled member-join event for server "Node 500cf44a-7a74-4410-eef7-2f4de6986be4.dc1" in area "wan"
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.987744 [INFO] agent: Started DNS server 127.0.0.1:18113 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.988238 [INFO] agent: Started DNS server 127.0.0.1:18113 (udp)
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.990643 [INFO] agent: Started HTTP server on 127.0.0.1:18114 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:39.990766 [INFO] agent: started state syncer
2019/12/30 18:54:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:40 [INFO]  raft: Node at 127.0.0.1:18118 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks - 2019/12/30 18:54:40.066377 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:54:40.127591 [DEBUG] manager: Rebalanced 1 servers, next active server is Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786.dc1 (Addr: tcp/127.0.0.1:17692) (DC: dc1)
2019/12/30 18:54:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:033a1a71-e9b1-b8d9-d7fa-bfa03f579482 Address:127.0.0.1:18124}]
TestHealthServiceChecks - 2019/12/30 18:54:40.144681 [INFO] manager: shutting down
TestHealthServiceChecks - 2019/12/30 18:54:40.145117 [INFO] agent: consul server down
TestHealthServiceChecks - 2019/12/30 18:54:40.145171 [INFO] agent: shutdown complete
TestHealthServiceChecks - 2019/12/30 18:54:40.145226 [INFO] agent: Stopping DNS server 127.0.0.1:18107 (tcp)
TestHealthServiceChecks - 2019/12/30 18:54:40.145368 [INFO] agent: Stopping DNS server 127.0.0.1:18107 (udp)
TestHealthServiceChecks - 2019/12/30 18:54:40.145533 [INFO] agent: Stopping HTTP server 127.0.0.1:18108 (tcp)
TestHealthServiceChecks - 2019/12/30 18:54:40.145739 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks - 2019/12/30 18:54:40.145812 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks (4.87s)
=== CONT  TestHealthNodeChecks
2019/12/30 18:54:40 [INFO]  raft: Node at 127.0.0.1:18124 [Follower] entering Follower state (Leader: "")
TestAgent_GetCoordinate - 2019/12/30 18:54:40.155602 [INFO] serf: EventMemberJoin: Node 033a1a71-e9b1-b8d9-d7fa-bfa03f579482.dc1 127.0.0.1
TestAgent_GetCoordinate - 2019/12/30 18:54:40.168951 [INFO] serf: EventMemberJoin: Node 033a1a71-e9b1-b8d9-d7fa-bfa03f579482 127.0.0.1
TestAgent_GetCoordinate - 2019/12/30 18:54:40.171375 [INFO] consul: Adding LAN server Node 033a1a71-e9b1-b8d9-d7fa-bfa03f579482 (Addr: tcp/127.0.0.1:18124) (DC: dc1)
TestAgent_GetCoordinate - 2019/12/30 18:54:40.174188 [INFO] consul: Handled member-join event for server "Node 033a1a71-e9b1-b8d9-d7fa-bfa03f579482.dc1" in area "wan"
TestAgent_GetCoordinate - 2019/12/30 18:54:40.175870 [INFO] agent: Started DNS server 127.0.0.1:18119 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:40.176288 [INFO] agent: Started DNS server 127.0.0.1:18119 (udp)
TestAgent_GetCoordinate - 2019/12/30 18:54:40.178707 [INFO] agent: Started HTTP server on 127.0.0.1:18120 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:40.178805 [INFO] agent: started state syncer
2019/12/30 18:54:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:40 [INFO]  raft: Node at 127.0.0.1:18124 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestHealthNodeChecks - 2019/12/30 18:54:40.241122 [WARN] agent: Node name "Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthNodeChecks - 2019/12/30 18:54:40.241681 [DEBUG] tlsutil: Update with version 1
TestHealthNodeChecks - 2019/12/30 18:54:40.244474 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:40 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:40 [INFO]  raft: Node at 127.0.0.1:18118 [Leader] entering Leader state
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:40.692457 [INFO] consul: cluster leadership acquired
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:40.692907 [INFO] consul: New leader elected: Node 500cf44a-7a74-4410-eef7-2f4de6986be4
2019/12/30 18:54:40 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:40 [INFO]  raft: Node at 127.0.0.1:18124 [Leader] entering Leader state
TestAgent_GetCoordinate - 2019/12/30 18:54:40.826374 [INFO] consul: cluster leadership acquired
TestAgent_GetCoordinate - 2019/12/30 18:54:40.826800 [INFO] consul: New leader elected: Node 033a1a71-e9b1-b8d9-d7fa-bfa03f579482
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:40.944156 [INFO] agent: Requesting shutdown
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:40.944286 [INFO] consul: shutting down server
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:40.944338 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:54:41.032866 [DEBUG] consul: Skipping self join check for "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786" since the cluster is too small
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.035028 [WARN] serf: Shutdown without a Leave
TestAgent_GetCoordinate - 2019/12/30 18:54:41.049658 [INFO] agent: Requesting shutdown
TestAgent_GetCoordinate - 2019/12/30 18:54:41.049793 [INFO] consul: shutting down server
TestAgent_GetCoordinate - 2019/12/30 18:54:41.049862 [WARN] serf: Shutdown without a Leave
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.140872 [INFO] manager: shutting down
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.141928 [INFO] agent: consul server down
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.142001 [INFO] agent: shutdown complete
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.142063 [INFO] agent: Stopping DNS server 127.0.0.1:17969 (tcp)
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.142253 [INFO] agent: Stopping DNS server 127.0.0.1:17969 (udp)
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.142447 [INFO] agent: Stopping HTTP server 127.0.0.1:17970 (tcp)
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.142683 [INFO] agent: Waiting for endpoints to shut down
TestPProfHandlers_EnableDebug - 2019/12/30 18:54:41.142766 [INFO] agent: Endpoints down
--- PASS: TestPProfHandlers_EnableDebug (32.53s)
=== CONT  TestHealthChecksInState_DistanceSort
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:41.146610 [INFO] agent: Synced node info
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:41.146747 [DEBUG] agent: Node info in sync
TestAgent_GetCoordinate - 2019/12/30 18:54:41.150130 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:41.275780 [WARN] agent: Node name "Node 86717d61-3e9b-0a54-a279-b3ba729e90e1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:41.277133 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:41.281379 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_GetCoordinate - 2019/12/30 18:54:41.332510 [INFO] manager: shutting down
TestAgent_GetCoordinate - 2019/12/30 18:54:41.332726 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestAgent_GetCoordinate - 2019/12/30 18:54:41.333106 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_GetCoordinate - 2019/12/30 18:54:41.333322 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_GetCoordinate - 2019/12/30 18:54:41.441089 [INFO] agent: consul server down
TestAgent_GetCoordinate - 2019/12/30 18:54:41.441171 [INFO] agent: shutdown complete
2019/12/30 18:54:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e88e2a3f-6f94-cfdf-091a-8569236dbc4e Address:127.0.0.1:18130}]
2019/12/30 18:54:41 [INFO]  raft: Node at 127.0.0.1:18130 [Follower] entering Follower state (Leader: "")
TestAgent_GetCoordinate - 2019/12/30 18:54:41.441231 [INFO] agent: Stopping DNS server 127.0.0.1:18119 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:41.441860 [INFO] agent: Stopping DNS server 127.0.0.1:18119 (udp)
TestAgent_GetCoordinate - 2019/12/30 18:54:41.442072 [INFO] agent: Stopping HTTP server 127.0.0.1:18120 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:41.442316 [INFO] agent: Waiting for endpoints to shut down
TestAgent_GetCoordinate - 2019/12/30 18:54:41.442404 [INFO] agent: Endpoints down
TestAgent_GetCoordinate - 2019/12/30 18:54:41.441327 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_GetCoordinate - 2019/12/30 18:54:41.445987 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestHealthNodeChecks - 2019/12/30 18:54:41.446676 [INFO] serf: EventMemberJoin: Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e.dc1 127.0.0.1
TestHealthNodeChecks - 2019/12/30 18:54:41.450795 [INFO] serf: EventMemberJoin: Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e 127.0.0.1
TestHealthNodeChecks - 2019/12/30 18:54:41.451691 [INFO] consul: Handled member-join event for server "Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e.dc1" in area "wan"
TestHealthNodeChecks - 2019/12/30 18:54:41.452005 [INFO] consul: Adding LAN server Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e (Addr: tcp/127.0.0.1:18130) (DC: dc1)
TestHealthNodeChecks - 2019/12/30 18:54:41.452605 [INFO] agent: Started DNS server 127.0.0.1:18125 (tcp)
TestHealthNodeChecks - 2019/12/30 18:54:41.453087 [INFO] agent: Started DNS server 127.0.0.1:18125 (udp)
TestHealthNodeChecks - 2019/12/30 18:54:41.467580 [INFO] agent: Started HTTP server on 127.0.0.1:18126 (tcp)
TestHealthNodeChecks - 2019/12/30 18:54:41.467692 [INFO] agent: started state syncer
2019/12/30 18:54:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:41 [INFO]  raft: Node at 127.0.0.1:18130 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_GetCoordinate - 2019/12/30 18:54:41.527092 [WARN] agent: Node name "Node d7154d4b-2eb3-0301-c5fb-03df008d12f2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_GetCoordinate - 2019/12/30 18:54:41.527749 [DEBUG] tlsutil: Update with version 1
TestAgent_GetCoordinate - 2019/12/30 18:54:41.530239 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:42.071084 [DEBUG] agent: Node info in sync
2019/12/30 18:54:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:42 [INFO]  raft: Node at 127.0.0.1:18130 [Leader] entering Leader state
TestHealthNodeChecks - 2019/12/30 18:54:42.182831 [INFO] consul: cluster leadership acquired
TestHealthNodeChecks - 2019/12/30 18:54:42.183362 [INFO] consul: New leader elected: Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e
jones - 2019/12/30 18:54:42.409131 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 90e88a15-5862-4de0-2f1f-c638261bac76.dc1 (Addr: tcp/127.0.0.1:17698) (DC: dc1)
2019/12/30 18:54:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:86717d61-3e9b-0a54-a279-b3ba729e90e1 Address:127.0.0.1:18136}]
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.580134 [INFO] serf: EventMemberJoin: Node 86717d61-3e9b-0a54-a279-b3ba729e90e1.dc1 127.0.0.1
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.584056 [INFO] serf: EventMemberJoin: Node 86717d61-3e9b-0a54-a279-b3ba729e90e1 127.0.0.1
2019/12/30 18:54:42 [INFO]  raft: Node at 127.0.0.1:18136 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.587462 [INFO] consul: Adding LAN server Node 86717d61-3e9b-0a54-a279-b3ba729e90e1 (Addr: tcp/127.0.0.1:18136) (DC: dc1)
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.588035 [INFO] consul: Handled member-join event for server "Node 86717d61-3e9b-0a54-a279-b3ba729e90e1.dc1" in area "wan"
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.588475 [INFO] agent: Started DNS server 127.0.0.1:18131 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.590474 [INFO] agent: Started DNS server 127.0.0.1:18131 (udp)
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.593103 [INFO] agent: Started HTTP server on 127.0.0.1:18132 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:42.593202 [INFO] agent: started state syncer
2019/12/30 18:54:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:42 [INFO]  raft: Node at 127.0.0.1:18136 [Candidate] entering Candidate state in term 2
TestHealthNodeChecks - 2019/12/30 18:54:42.669575 [INFO] agent: Synced node info
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:42.758220 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:42.758745 [DEBUG] consul: Skipping self join check for "Node 500cf44a-7a74-4410-eef7-2f4de6986be4" since the cluster is too small
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:42.758941 [INFO] consul: member 'Node 500cf44a-7a74-4410-eef7-2f4de6986be4' joined, marking health alive
2019/12/30 18:54:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d7154d4b-2eb3-0301-c5fb-03df008d12f2 Address:127.0.0.1:18142}]
2019/12/30 18:54:42 [INFO]  raft: Node at 127.0.0.1:18142 [Follower] entering Follower state (Leader: "")
TestAgent_GetCoordinate - 2019/12/30 18:54:42.773909 [INFO] serf: EventMemberJoin: Node d7154d4b-2eb3-0301-c5fb-03df008d12f2.dc1 127.0.0.1
TestAgent_GetCoordinate - 2019/12/30 18:54:42.779917 [INFO] serf: EventMemberJoin: Node d7154d4b-2eb3-0301-c5fb-03df008d12f2 127.0.0.1
TestAgent_GetCoordinate - 2019/12/30 18:54:42.781795 [INFO] consul: Adding LAN server Node d7154d4b-2eb3-0301-c5fb-03df008d12f2 (Addr: tcp/127.0.0.1:18142) (DC: dc1)
TestAgent_GetCoordinate - 2019/12/30 18:54:42.782101 [INFO] consul: Handled member-join event for server "Node d7154d4b-2eb3-0301-c5fb-03df008d12f2.dc1" in area "wan"
TestAgent_GetCoordinate - 2019/12/30 18:54:42.782223 [INFO] agent: Started DNS server 127.0.0.1:18137 (udp)
TestAgent_GetCoordinate - 2019/12/30 18:54:42.782607 [INFO] agent: Started DNS server 127.0.0.1:18137 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:42.786128 [INFO] agent: Started HTTP server on 127.0.0.1:18138 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:42.786243 [INFO] agent: started state syncer
2019/12/30 18:54:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:42 [INFO]  raft: Node at 127.0.0.1:18142 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:54:43.243106 [DEBUG] consul: Skipping self join check for "Node 90e88a15-5862-4de0-2f1f-c638261bac76" since the cluster is too small
2019/12/30 18:54:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:43 [INFO]  raft: Node at 127.0.0.1:18136 [Leader] entering Leader state
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:43.369739 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:43.370205 [INFO] consul: New leader elected: Node 86717d61-3e9b-0a54-a279-b3ba729e90e1
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.535807 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.538130 [INFO] agent: Requesting shutdown
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.538248 [INFO] consul: shutting down server
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.538311 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.542261 [WARN] consul: error getting server health from "Node 500cf44a-7a74-4410-eef7-2f4de6986be4": rpc error making call: EOF
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.616678 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:43 [INFO]  raft: Node at 127.0.0.1:18142 [Leader] entering Leader state
TestAgent_GetCoordinate - 2019/12/30 18:54:43.617884 [INFO] consul: cluster leadership acquired
TestAgent_GetCoordinate - 2019/12/30 18:54:43.618282 [INFO] consul: New leader elected: Node d7154d4b-2eb3-0301-c5fb-03df008d12f2
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.701025 [INFO] manager: shutting down
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.701562 [INFO] agent: consul server down
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.701630 [INFO] agent: shutdown complete
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.701690 [INFO] agent: Stopping DNS server 127.0.0.1:18113 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.701860 [INFO] agent: Stopping DNS server 127.0.0.1:18113 (udp)
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.702028 [INFO] agent: Stopping HTTP server 127.0.0.1:18114 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.702246 [INFO] agent: Waiting for endpoints to shut down
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:43.702327 [INFO] agent: Endpoints down
--- PASS: TestHealthNodeChecks_Filtering (4.97s)
=== CONT  TestHealthChecksInState_Filter
TestHealthNodeChecks - 2019/12/30 18:54:43.783207 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthNodeChecks - 2019/12/30 18:54:43.783680 [DEBUG] consul: Skipping self join check for "Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e" since the cluster is too small
TestHealthNodeChecks - 2019/12/30 18:54:43.783862 [INFO] consul: member 'Node e88e2a3f-6f94-cfdf-091a-8569236dbc4e' joined, marking health alive
TestHealthNodeChecks - 2019/12/30 18:54:43.802776 [DEBUG] agent: Node info in sync
TestHealthNodeChecks - 2019/12/30 18:54:43.802887 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_GetCoordinate - 2019/12/30 18:54:43.853118 [INFO] agent: Requesting shutdown
TestAgent_GetCoordinate - 2019/12/30 18:54:43.853237 [INFO] consul: shutting down server
TestAgent_GetCoordinate - 2019/12/30 18:54:43.853292 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_Filter - 2019/12/30 18:54:43.853885 [WARN] agent: Node name "Node d0258369-45a2-5541-1208-0a4e619b2c33" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_Filter - 2019/12/30 18:54:43.854588 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_Filter - 2019/12/30 18:54:43.856975 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_GetCoordinate - 2019/12/30 18:54:43.943458 [WARN] serf: Shutdown without a Leave
TestAgent_GetCoordinate - 2019/12/30 18:54:43.943803 [INFO] agent: Synced node info
TestAgent_GetCoordinate - 2019/12/30 18:54:43.943909 [DEBUG] agent: Node info in sync
TestHealthNodeChecks - 2019/12/30 18:54:43.952095 [INFO] agent: Requesting shutdown
TestHealthNodeChecks - 2019/12/30 18:54:43.952194 [INFO] consul: shutting down server
TestHealthNodeChecks - 2019/12/30 18:54:43.952260 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks - 2019/12/30 18:54:44.024053 [WARN] serf: Shutdown without a Leave
TestAgent_GetCoordinate - 2019/12/30 18:54:44.025289 [INFO] manager: shutting down
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.028425 [INFO] agent: Synced node info
jones - 2019/12/30 18:54:44.077958 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6.dc1 (Addr: tcp/127.0.0.1:17704) (DC: dc1)
TestHealthNodeChecks - 2019/12/30 18:54:44.107551 [INFO] manager: shutting down
TestAgent_GetCoordinate - 2019/12/30 18:54:44.107689 [INFO] agent: consul server down
TestAgent_GetCoordinate - 2019/12/30 18:54:44.107745 [INFO] agent: shutdown complete
TestAgent_GetCoordinate - 2019/12/30 18:54:44.107803 [INFO] agent: Stopping DNS server 127.0.0.1:18137 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:44.107969 [INFO] agent: Stopping DNS server 127.0.0.1:18137 (udp)
TestHealthNodeChecks - 2019/12/30 18:54:44.108034 [INFO] agent: consul server down
TestHealthNodeChecks - 2019/12/30 18:54:44.108079 [INFO] agent: shutdown complete
TestHealthNodeChecks - 2019/12/30 18:54:44.108170 [INFO] agent: Stopping DNS server 127.0.0.1:18125 (tcp)
TestHealthNodeChecks - 2019/12/30 18:54:44.108348 [INFO] agent: Stopping DNS server 127.0.0.1:18125 (udp)
TestAgent_GetCoordinate - 2019/12/30 18:54:44.108173 [INFO] agent: Stopping HTTP server 127.0.0.1:18138 (tcp)
TestHealthNodeChecks - 2019/12/30 18:54:44.108512 [INFO] agent: Stopping HTTP server 127.0.0.1:18126 (tcp)
TestAgent_GetCoordinate - 2019/12/30 18:54:44.108693 [INFO] agent: Waiting for endpoints to shut down
TestHealthNodeChecks - 2019/12/30 18:54:44.108723 [INFO] agent: Waiting for endpoints to shut down
TestAgent_GetCoordinate - 2019/12/30 18:54:44.108768 [INFO] agent: Endpoints down
TestHealthNodeChecks - 2019/12/30 18:54:44.108781 [INFO] agent: Endpoints down
--- PASS: TestAgent_GetCoordinate (5.13s)
--- PASS: TestHealthNodeChecks (3.96s)
TestAgent_GetCoordinate - 2019/12/30 18:54:44.108228 [ERR] autopilot: failed to initialize config: leadership lost while committing log
=== CONT  TestUUIDToUint64
--- PASS: TestUUIDToUint64 (0.00s)
=== CONT  TestEventList_EventBufOrder
TestAgent_GetCoordinate - 2019/12/30 18:54:44.109202 [ERR] consul: failed to establish leadership: raft is already shutdown
=== CONT  TestHealthChecksInState_NodeMetaFilter
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_EventBufOrder - 2019/12/30 18:54:44.232298 [WARN] agent: Node name "Node 511d43aa-e235-e7ba-ff9b-4177271119cf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_EventBufOrder - 2019/12/30 18:54:44.232712 [DEBUG] tlsutil: Update with version 1
TestEventList_EventBufOrder - 2019/12/30 18:54:44.240461 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:44.390422 [WARN] agent: Node name "Node 79653281-7bf3-dafb-87f5-f773cfa88682" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:44.392995 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:44.395590 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks_Filtering - 2019/12/30 18:54:44.535928 [WARN] consul: error getting server health from "Node 500cf44a-7a74-4410-eef7-2f4de6986be4": context deadline exceeded
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.668019 [INFO] agent: Requesting shutdown
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.668124 [INFO] consul: shutting down server
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.668174 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.749080 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.832506 [INFO] manager: shutting down
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.833551 [INFO] agent: consul server down
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.833623 [INFO] agent: shutdown complete
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.833694 [INFO] agent: Stopping DNS server 127.0.0.1:18131 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.833857 [INFO] agent: Stopping DNS server 127.0.0.1:18131 (udp)
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.834014 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.834059 [INFO] agent: Stopping HTTP server 127.0.0.1:18132 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.834217 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.834276 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_DistanceSort - 2019/12/30 18:54:44.834342 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_DistanceSort (3.69s)
=== CONT  TestEventList_Blocking
2019/12/30 18:54:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d0258369-45a2-5541-1208-0a4e619b2c33 Address:127.0.0.1:18148}]
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.919785 [INFO] serf: EventMemberJoin: Node d0258369-45a2-5541-1208-0a4e619b2c33.dc1 127.0.0.1
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.922832 [INFO] serf: EventMemberJoin: Node d0258369-45a2-5541-1208-0a4e619b2c33 127.0.0.1
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.924068 [INFO] agent: Started DNS server 127.0.0.1:18143 (udp)
2019/12/30 18:54:44 [INFO]  raft: Node at 127.0.0.1:18148 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.929095 [INFO] consul: Handled member-join event for server "Node d0258369-45a2-5541-1208-0a4e619b2c33.dc1" in area "wan"
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.929564 [INFO] consul: Adding LAN server Node d0258369-45a2-5541-1208-0a4e619b2c33 (Addr: tcp/127.0.0.1:18148) (DC: dc1)
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.930501 [INFO] agent: Started DNS server 127.0.0.1:18143 (tcp)
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.932756 [INFO] agent: Started HTTP server on 127.0.0.1:18144 (tcp)
TestHealthChecksInState_Filter - 2019/12/30 18:54:44.932873 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_Blocking - 2019/12/30 18:54:44.946610 [WARN] agent: Node name "Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_Blocking - 2019/12/30 18:54:44.947014 [DEBUG] tlsutil: Update with version 1
TestEventList_Blocking - 2019/12/30 18:54:44.949144 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:44 [INFO]  raft: Node at 127.0.0.1:18148 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:54:45.259634 [DEBUG] consul: Skipping self join check for "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6" since the cluster is too small
2019/12/30 18:54:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:79653281-7bf3-dafb-87f5-f773cfa88682 Address:127.0.0.1:18160}]
2019/12/30 18:54:45 [INFO]  raft: Node at 127.0.0.1:18160 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.382052 [INFO] serf: EventMemberJoin: Node 79653281-7bf3-dafb-87f5-f773cfa88682.dc1 127.0.0.1
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.385524 [INFO] serf: EventMemberJoin: Node 79653281-7bf3-dafb-87f5-f773cfa88682 127.0.0.1
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.386328 [INFO] consul: Handled member-join event for server "Node 79653281-7bf3-dafb-87f5-f773cfa88682.dc1" in area "wan"
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.386475 [INFO] consul: Adding LAN server Node 79653281-7bf3-dafb-87f5-f773cfa88682 (Addr: tcp/127.0.0.1:18160) (DC: dc1)
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.386975 [INFO] agent: Started DNS server 127.0.0.1:18155 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.387050 [INFO] agent: Started DNS server 127.0.0.1:18155 (udp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.389523 [INFO] agent: Started HTTP server on 127.0.0.1:18156 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:45.389625 [INFO] agent: started state syncer
2019/12/30 18:54:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:45 [INFO]  raft: Node at 127.0.0.1:18160 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:511d43aa-e235-e7ba-ff9b-4177271119cf Address:127.0.0.1:18154}]
2019/12/30 18:54:45 [INFO]  raft: Node at 127.0.0.1:18154 [Follower] entering Follower state (Leader: "")
TestEventList_EventBufOrder - 2019/12/30 18:54:45.578640 [INFO] serf: EventMemberJoin: Node 511d43aa-e235-e7ba-ff9b-4177271119cf.dc1 127.0.0.1
TestEventList_EventBufOrder - 2019/12/30 18:54:45.588908 [INFO] serf: EventMemberJoin: Node 511d43aa-e235-e7ba-ff9b-4177271119cf 127.0.0.1
TestEventList_EventBufOrder - 2019/12/30 18:54:45.590772 [INFO] consul: Adding LAN server Node 511d43aa-e235-e7ba-ff9b-4177271119cf (Addr: tcp/127.0.0.1:18154) (DC: dc1)
TestEventList_EventBufOrder - 2019/12/30 18:54:45.591567 [INFO] consul: Handled member-join event for server "Node 511d43aa-e235-e7ba-ff9b-4177271119cf.dc1" in area "wan"
TestEventList_EventBufOrder - 2019/12/30 18:54:45.593451 [INFO] agent: Started DNS server 127.0.0.1:18149 (tcp)
TestEventList_EventBufOrder - 2019/12/30 18:54:45.593979 [INFO] agent: Started DNS server 127.0.0.1:18149 (udp)
TestEventList_EventBufOrder - 2019/12/30 18:54:45.596928 [INFO] agent: Started HTTP server on 127.0.0.1:18150 (tcp)
TestEventList_EventBufOrder - 2019/12/30 18:54:45.597050 [INFO] agent: started state syncer
2019/12/30 18:54:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:45 [INFO]  raft: Node at 127.0.0.1:18154 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:45 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:45 [INFO]  raft: Node at 127.0.0.1:18148 [Leader] entering Leader state
TestHealthChecksInState_Filter - 2019/12/30 18:54:45.909892 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_Filter - 2019/12/30 18:54:45.910443 [INFO] consul: New leader elected: Node d0258369-45a2-5541-1208-0a4e619b2c33
2019/12/30 18:54:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:46 [INFO]  raft: Node at 127.0.0.1:18160 [Leader] entering Leader state
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:46.267908 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:46.268829 [INFO] consul: New leader elected: Node 79653281-7bf3-dafb-87f5-f773cfa88682
2019/12/30 18:54:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:46 [INFO]  raft: Node at 127.0.0.1:18154 [Leader] entering Leader state
2019/12/30 18:54:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cd8824b2-ab1d-f405-a4ec-a4fa806b2103 Address:127.0.0.1:18166}]
TestEventList_EventBufOrder - 2019/12/30 18:54:46.376895 [INFO] consul: cluster leadership acquired
TestEventList_EventBufOrder - 2019/12/30 18:54:46.377546 [INFO] consul: New leader elected: Node 511d43aa-e235-e7ba-ff9b-4177271119cf
TestEventList_Blocking - 2019/12/30 18:54:46.379620 [INFO] serf: EventMemberJoin: Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103.dc1 127.0.0.1
TestHealthChecksInState_Filter - 2019/12/30 18:54:46.380532 [INFO] agent: Synced node info
TestHealthChecksInState_Filter - 2019/12/30 18:54:46.380815 [DEBUG] agent: Node info in sync
TestEventList_Blocking - 2019/12/30 18:54:46.384006 [INFO] serf: EventMemberJoin: Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103 127.0.0.1
2019/12/30 18:54:46 [INFO]  raft: Node at 127.0.0.1:18166 [Follower] entering Follower state (Leader: "")
TestEventList_Blocking - 2019/12/30 18:54:46.389303 [INFO] consul: Adding LAN server Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103 (Addr: tcp/127.0.0.1:18166) (DC: dc1)
TestEventList_Blocking - 2019/12/30 18:54:46.389745 [INFO] agent: Started DNS server 127.0.0.1:18161 (udp)
TestEventList_Blocking - 2019/12/30 18:54:46.390087 [INFO] consul: Handled member-join event for server "Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103.dc1" in area "wan"
TestEventList_Blocking - 2019/12/30 18:54:46.390823 [INFO] agent: Started DNS server 127.0.0.1:18161 (tcp)
jones - 2019/12/30 18:54:46.437405 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef.dc1 (Addr: tcp/127.0.0.1:17710) (DC: dc1)
TestEventList_Blocking - 2019/12/30 18:54:46.441863 [INFO] agent: Started HTTP server on 127.0.0.1:18162 (tcp)
TestEventList_Blocking - 2019/12/30 18:54:46.451281 [INFO] agent: started state syncer
2019/12/30 18:54:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:46 [INFO]  raft: Node at 127.0.0.1:18166 [Candidate] entering Candidate state in term 2
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.121879 [INFO] agent: Synced node info
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.124788 [INFO] agent: Requesting shutdown
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.124882 [INFO] consul: shutting down server
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.124936 [WARN] serf: Shutdown without a Leave
TestEventList_EventBufOrder - 2019/12/30 18:54:47.275395 [INFO] agent: Synced node info
TestEventList_EventBufOrder - 2019/12/30 18:54:47.275527 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:54:47.278361 [DEBUG] consul: Skipping self join check for "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef" since the cluster is too small
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.279235 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.391796 [INFO] manager: shutting down
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.396246 [INFO] agent: Requesting shutdown
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.396351 [INFO] consul: shutting down server
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.396404 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.507649 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:47 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:47 [INFO]  raft: Node at 127.0.0.1:18166 [Leader] entering Leader state
TestEventList_Blocking - 2019/12/30 18:54:47.512794 [INFO] consul: cluster leadership acquired
TestEventList_Blocking - 2019/12/30 18:54:47.513500 [INFO] consul: New leader elected: Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.514646 [INFO] agent: consul server down
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.514749 [INFO] agent: shutdown complete
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.514830 [INFO] agent: Stopping DNS server 127.0.0.1:18155 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.515032 [INFO] agent: Stopping DNS server 127.0.0.1:18155 (udp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.515241 [INFO] agent: Stopping HTTP server 127.0.0.1:18156 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.515399 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.515513 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.515603 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_NodeMetaFilter (3.41s)
=== CONT  TestEventList_ACLFilter
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.515795 [ERR] consul: failed to establish leadership: raft is already shutdown
TestHealthChecksInState_NodeMetaFilter - 2019/12/30 18:54:47.515991 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_ACLFilter - 2019/12/30 18:54:47.581729 [WARN] agent: Node name "Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_ACLFilter - 2019/12/30 18:54:47.582189 [DEBUG] tlsutil: Update with version 1
TestEventList_ACLFilter - 2019/12/30 18:54:47.584583 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.632623 [INFO] manager: shutting down
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.692455 [INFO] agent: consul server down
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.692548 [INFO] agent: shutdown complete
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.692620 [INFO] agent: Stopping DNS server 127.0.0.1:18143 (tcp)
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.692804 [INFO] agent: Stopping DNS server 127.0.0.1:18143 (udp)
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.692993 [INFO] agent: Stopping HTTP server 127.0.0.1:18144 (tcp)
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.693224 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.693300 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_Filter (3.99s)
=== CONT  TestEventList_Filter
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.724138 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestHealthChecksInState_Filter - 2019/12/30 18:54:47.724484 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_Filter - 2019/12/30 18:54:47.788992 [WARN] agent: Node name "Node f3be110f-c9d4-535b-320a-738c68af895e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_Filter - 2019/12/30 18:54:47.789483 [DEBUG] tlsutil: Update with version 1
TestEventList_Filter - 2019/12/30 18:54:47.791687 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_Blocking - 2019/12/30 18:54:48.380506 [INFO] agent: Synced node info
TestEventList_Blocking - 2019/12/30 18:54:48.380643 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:54:48.501137 [DEBUG] manager: Rebalanced 1 servers, next active server is Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d.dc1 (Addr: tcp/127.0.0.1:17716) (DC: dc1)
TestEventList_EventBufOrder - 2019/12/30 18:54:48.812742 [DEBUG] agent: Node info in sync
TestEventList_Blocking - 2019/12/30 18:54:48.850617 [DEBUG] agent: Node info in sync
TestEventList_EventBufOrder - 2019/12/30 18:54:48.950188 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_EventBufOrder - 2019/12/30 18:54:48.950701 [DEBUG] consul: Skipping self join check for "Node 511d43aa-e235-e7ba-ff9b-4177271119cf" since the cluster is too small
TestEventList_EventBufOrder - 2019/12/30 18:54:48.950905 [INFO] consul: member 'Node 511d43aa-e235-e7ba-ff9b-4177271119cf' joined, marking health alive
jones - 2019/12/30 18:54:48.956870 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:54:48.956950 [DEBUG] agent: Node info in sync
2019/12/30 18:54:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:72f69057-a23d-ff1b-4f07-c7769a47ebb4 Address:127.0.0.1:18172}]
2019/12/30 18:54:49 [INFO]  raft: Node at 127.0.0.1:18172 [Follower] entering Follower state (Leader: "")
TestEventList_ACLFilter - 2019/12/30 18:54:49.037275 [INFO] serf: EventMemberJoin: Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4.dc1 127.0.0.1
TestEventList_ACLFilter - 2019/12/30 18:54:49.041138 [INFO] serf: EventMemberJoin: Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4 127.0.0.1
TestEventList_ACLFilter - 2019/12/30 18:54:49.041930 [INFO] consul: Adding LAN server Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4 (Addr: tcp/127.0.0.1:18172) (DC: dc1)
TestEventList_ACLFilter - 2019/12/30 18:54:49.041972 [INFO] consul: Handled member-join event for server "Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4.dc1" in area "wan"
TestEventList_ACLFilter - 2019/12/30 18:54:49.042632 [INFO] agent: Started DNS server 127.0.0.1:18167 (udp)
TestEventList_ACLFilter - 2019/12/30 18:54:49.042717 [INFO] agent: Started DNS server 127.0.0.1:18167 (tcp)
TestEventList_ACLFilter - 2019/12/30 18:54:49.045198 [INFO] agent: Started HTTP server on 127.0.0.1:18168 (tcp)
TestEventList_ACLFilter - 2019/12/30 18:54:49.045309 [INFO] agent: started state syncer
2019/12/30 18:54:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:49 [INFO]  raft: Node at 127.0.0.1:18172 [Candidate] entering Candidate state in term 2
TestEventList_EventBufOrder - 2019/12/30 18:54:49.138486 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/12/30 18:54:49.138671 [DEBUG] agent: new event: foo (fa06bcd8-3626-37e0-c708-347c15b08ce0)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.140300 [DEBUG] consul: User event: bar
TestEventList_EventBufOrder - 2019/12/30 18:54:49.140456 [DEBUG] agent: new event: bar (55f08a32-7830-4fe0-7eb4-10c5700bda2d)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.141799 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/12/30 18:54:49.141960 [DEBUG] agent: new event: foo (15fff084-2c22-c323-28d5-9af12d3c4635)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.143230 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/12/30 18:54:49.143373 [DEBUG] agent: new event: foo (4ddb55c6-7796-d24f-97d0-c6e59d03bdba)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.144512 [DEBUG] consul: User event: bar
TestEventList_EventBufOrder - 2019/12/30 18:54:49.144665 [DEBUG] agent: new event: bar (a9d3feb1-03bb-8df3-61bf-004258d67104)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.145036 [INFO] agent: Requesting shutdown
TestEventList_EventBufOrder - 2019/12/30 18:54:49.146002 [INFO] consul: shutting down server
TestEventList_EventBufOrder - 2019/12/30 18:54:49.146222 [WARN] serf: Shutdown without a Leave
TestEventList_EventBufOrder - 2019/12/30 18:54:49.282622 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f3be110f-c9d4-535b-320a-738c68af895e Address:127.0.0.1:18178}]
2019/12/30 18:54:49 [INFO]  raft: Node at 127.0.0.1:18178 [Follower] entering Follower state (Leader: "")
TestEventList_Filter - 2019/12/30 18:54:49.287139 [INFO] serf: EventMemberJoin: Node f3be110f-c9d4-535b-320a-738c68af895e.dc1 127.0.0.1
TestEventList_Filter - 2019/12/30 18:54:49.291287 [INFO] serf: EventMemberJoin: Node f3be110f-c9d4-535b-320a-738c68af895e 127.0.0.1
TestEventList_Filter - 2019/12/30 18:54:49.291913 [INFO] consul: Handled member-join event for server "Node f3be110f-c9d4-535b-320a-738c68af895e.dc1" in area "wan"
TestEventList_Filter - 2019/12/30 18:54:49.292126 [INFO] consul: Adding LAN server Node f3be110f-c9d4-535b-320a-738c68af895e (Addr: tcp/127.0.0.1:18178) (DC: dc1)
TestEventList_Filter - 2019/12/30 18:54:49.292529 [INFO] agent: Started DNS server 127.0.0.1:18173 (tcp)
TestEventList_Filter - 2019/12/30 18:54:49.292623 [INFO] agent: Started DNS server 127.0.0.1:18173 (udp)
TestEventList_Filter - 2019/12/30 18:54:49.295179 [INFO] agent: Started HTTP server on 127.0.0.1:18174 (tcp)
TestEventList_Filter - 2019/12/30 18:54:49.295282 [INFO] agent: started state syncer
2019/12/30 18:54:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:49 [INFO]  raft: Node at 127.0.0.1:18178 [Candidate] entering Candidate state in term 2
TestEventList_EventBufOrder - 2019/12/30 18:54:49.392845 [INFO] manager: shutting down
TestEventList_EventBufOrder - 2019/12/30 18:54:49.393578 [INFO] agent: consul server down
TestEventList_EventBufOrder - 2019/12/30 18:54:49.393649 [INFO] agent: shutdown complete
TestEventList_EventBufOrder - 2019/12/30 18:54:49.393751 [INFO] agent: Stopping DNS server 127.0.0.1:18149 (tcp)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.393927 [INFO] agent: Stopping DNS server 127.0.0.1:18149 (udp)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.394131 [INFO] agent: Stopping HTTP server 127.0.0.1:18150 (tcp)
TestEventList_EventBufOrder - 2019/12/30 18:54:49.394470 [INFO] agent: Waiting for endpoints to shut down
TestEventList_EventBufOrder - 2019/12/30 18:54:49.394576 [INFO] agent: Endpoints down
--- PASS: TestEventList_EventBufOrder (5.29s)
=== CONT  TestEventList
WARNING: bootstrap = true: do not enable unless necessary
TestEventList - 2019/12/30 18:54:49.475351 [WARN] agent: Node name "Node c46f46ba-f3a7-5d46-b149-2181ee989c49" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList - 2019/12/30 18:54:49.476071 [DEBUG] tlsutil: Update with version 1
TestEventList - 2019/12/30 18:54:49.482844 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_Blocking - 2019/12/30 18:54:49.550509 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_Blocking - 2019/12/30 18:54:49.550984 [DEBUG] consul: Skipping self join check for "Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103" since the cluster is too small
TestEventList_Blocking - 2019/12/30 18:54:49.551158 [INFO] consul: member 'Node cd8824b2-ab1d-f405-a4ec-a4fa806b2103' joined, marking health alive
2019/12/30 18:54:49 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:49 [INFO]  raft: Node at 127.0.0.1:18172 [Leader] entering Leader state
TestEventList_ACLFilter - 2019/12/30 18:54:49.643805 [INFO] consul: cluster leadership acquired
TestEventList_ACLFilter - 2019/12/30 18:54:49.644200 [INFO] consul: New leader elected: Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4
jones - 2019/12/30 18:54:49.736061 [DEBUG] consul: Skipping self join check for "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d" since the cluster is too small
TestEventList_ACLFilter - 2019/12/30 18:54:49.738209 [ERR] agent: failed to sync remote state: ACL not found
TestEventList_Blocking - 2019/12/30 18:54:49.763325 [DEBUG] consul: User event: test
TestEventList_Blocking - 2019/12/30 18:54:49.763504 [DEBUG] agent: new event: test (15282917-13d5-fea0-00d9-c0b9703077e6)
TestEventList_ACLFilter - 2019/12/30 18:54:49.797971 [INFO] acl: initializing acls
TestEventList_Blocking - 2019/12/30 18:54:49.814250 [DEBUG] consul: User event: second
TestEventList_Blocking - 2019/12/30 18:54:49.814528 [DEBUG] agent: new event: second (73886815-b485-3aaa-e018-7c1f8785efbf)
TestEventList_Blocking - 2019/12/30 18:54:49.814994 [INFO] agent: Requesting shutdown
TestEventList_Blocking - 2019/12/30 18:54:49.815081 [INFO] consul: shutting down server
TestEventList_Blocking - 2019/12/30 18:54:49.815134 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:49 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:49 [INFO]  raft: Node at 127.0.0.1:18178 [Leader] entering Leader state
TestEventList_Filter - 2019/12/30 18:54:49.869479 [INFO] consul: cluster leadership acquired
TestEventList_Filter - 2019/12/30 18:54:49.869933 [INFO] consul: New leader elected: Node f3be110f-c9d4-535b-320a-738c68af895e
TestEventList_Blocking - 2019/12/30 18:54:49.965869 [WARN] serf: Shutdown without a Leave
TestEventList_Blocking - 2019/12/30 18:54:50.095831 [INFO] manager: shutting down
TestEventList_Blocking - 2019/12/30 18:54:50.096534 [INFO] agent: consul server down
TestEventList_Blocking - 2019/12/30 18:54:50.096614 [INFO] agent: shutdown complete
TestEventList_Blocking - 2019/12/30 18:54:50.096684 [INFO] agent: Stopping DNS server 127.0.0.1:18161 (tcp)
TestEventList_Blocking - 2019/12/30 18:54:50.096880 [INFO] agent: Stopping DNS server 127.0.0.1:18161 (udp)
TestEventList_Blocking - 2019/12/30 18:54:50.097105 [INFO] agent: Stopping HTTP server 127.0.0.1:18162 (tcp)
TestEventList_ACLFilter - 2019/12/30 18:54:50.098275 [INFO] consul: Created ACL 'global-management' policy
TestEventList_ACLFilter - 2019/12/30 18:54:50.098363 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_Blocking - 2019/12/30 18:54:50.109894 [INFO] agent: Waiting for endpoints to shut down
TestEventList_Blocking - 2019/12/30 18:54:50.110035 [INFO] agent: Endpoints down
--- PASS: TestEventList_Blocking (5.28s)
=== CONT  TestEventFire_token
TestEventList_ACLFilter - 2019/12/30 18:54:50.119058 [INFO] acl: initializing acls
TestEventList_ACLFilter - 2019/12/30 18:54:50.119278 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_ACLFilter - 2019/12/30 18:54:50.157944 [ERR] agent: failed to sync remote state: ACL not found
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestEventFire_token - 2019/12/30 18:54:50.258193 [WARN] agent: Node name "Node 88149a31-421c-297f-f289-64fe65dc4fa8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventFire_token - 2019/12/30 18:54:50.259077 [DEBUG] tlsutil: Update with version 1
TestEventFire_token - 2019/12/30 18:54:50.263207 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:54:51.044905 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 5122c9d8-8979-c841-956f-094a90e62880.dc1 (Addr: tcp/127.0.0.1:17722) (DC: dc1)
TestEventList_Filter - 2019/12/30 18:54:51.851176 [INFO] agent: Synced node info
2019/12/30 18:54:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c46f46ba-f3a7-5d46-b149-2181ee989c49 Address:127.0.0.1:18184}]
2019/12/30 18:54:52 [INFO]  raft: Node at 127.0.0.1:18184 [Follower] entering Follower state (Leader: "")
TestEventList - 2019/12/30 18:54:52.138236 [INFO] serf: EventMemberJoin: Node c46f46ba-f3a7-5d46-b149-2181ee989c49.dc1 127.0.0.1
TestEventList - 2019/12/30 18:54:52.143429 [INFO] serf: EventMemberJoin: Node c46f46ba-f3a7-5d46-b149-2181ee989c49 127.0.0.1
TestEventList - 2019/12/30 18:54:52.144675 [INFO] consul: Adding LAN server Node c46f46ba-f3a7-5d46-b149-2181ee989c49 (Addr: tcp/127.0.0.1:18184) (DC: dc1)
TestEventList - 2019/12/30 18:54:52.145148 [INFO] consul: Handled member-join event for server "Node c46f46ba-f3a7-5d46-b149-2181ee989c49.dc1" in area "wan"
TestEventList - 2019/12/30 18:54:52.147039 [INFO] agent: Started DNS server 127.0.0.1:18179 (tcp)
TestEventList - 2019/12/30 18:54:52.147495 [INFO] agent: Started DNS server 127.0.0.1:18179 (udp)
TestEventList - 2019/12/30 18:54:52.151345 [INFO] agent: Started HTTP server on 127.0.0.1:18180 (tcp)
TestEventList - 2019/12/30 18:54:52.151454 [INFO] agent: started state syncer
2019/12/30 18:54:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:52 [INFO]  raft: Node at 127.0.0.1:18184 [Candidate] entering Candidate state in term 2
TestEventList_ACLFilter - 2019/12/30 18:54:52.242453 [INFO] consul: Bootstrapped ACL master token from configuration
TestEventList_ACLFilter - 2019/12/30 18:54:52.242620 [INFO] consul: Bootstrapped ACL master token from configuration
TestEventList_Filter - 2019/12/30 18:54:52.354222 [DEBUG] agent: Node info in sync
TestEventList_Filter - 2019/12/30 18:54:52.354339 [DEBUG] agent: Node info in sync
TestEventList_ACLFilter - 2019/12/30 18:54:52.570031 [INFO] consul: Created ACL anonymous token from configuration
TestEventList_ACLFilter - 2019/12/30 18:54:52.570878 [INFO] serf: EventMemberUpdate: Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4
TestEventList_ACLFilter - 2019/12/30 18:54:52.571452 [INFO] serf: EventMemberUpdate: Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4.dc1
TestEventList_ACLFilter - 2019/12/30 18:54:52.573563 [INFO] consul: Created ACL anonymous token from configuration
TestEventList_ACLFilter - 2019/12/30 18:54:52.573631 [DEBUG] acl: transitioning out of legacy ACL mode
TestEventList_ACLFilter - 2019/12/30 18:54:52.574450 [INFO] serf: EventMemberUpdate: Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4
TestEventList_ACLFilter - 2019/12/30 18:54:52.575043 [INFO] serf: EventMemberUpdate: Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4.dc1
2019/12/30 18:54:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:88149a31-421c-297f-f289-64fe65dc4fa8 Address:127.0.0.1:18190}]
2019/12/30 18:54:52 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:52 [INFO]  raft: Node at 127.0.0.1:18184 [Leader] entering Leader state
TestEventFire_token - 2019/12/30 18:54:52.744733 [INFO] serf: EventMemberJoin: Node 88149a31-421c-297f-f289-64fe65dc4fa8.dc1 127.0.0.1
TestEventList - 2019/12/30 18:54:52.745604 [INFO] consul: cluster leadership acquired
TestEventList - 2019/12/30 18:54:52.746056 [INFO] consul: New leader elected: Node c46f46ba-f3a7-5d46-b149-2181ee989c49
2019/12/30 18:54:52 [INFO]  raft: Node at 127.0.0.1:18190 [Follower] entering Follower state (Leader: "")
TestEventFire_token - 2019/12/30 18:54:52.748076 [INFO] serf: EventMemberJoin: Node 88149a31-421c-297f-f289-64fe65dc4fa8 127.0.0.1
TestEventFire_token - 2019/12/30 18:54:52.748728 [INFO] consul: Handled member-join event for server "Node 88149a31-421c-297f-f289-64fe65dc4fa8.dc1" in area "wan"
TestEventFire_token - 2019/12/30 18:54:52.748996 [INFO] consul: Adding LAN server Node 88149a31-421c-297f-f289-64fe65dc4fa8 (Addr: tcp/127.0.0.1:18190) (DC: dc1)
TestEventFire_token - 2019/12/30 18:54:52.749440 [INFO] agent: Started DNS server 127.0.0.1:18185 (udp)
TestEventFire_token - 2019/12/30 18:54:52.749594 [INFO] agent: Started DNS server 127.0.0.1:18185 (tcp)
TestEventFire_token - 2019/12/30 18:54:52.752313 [INFO] agent: Started HTTP server on 127.0.0.1:18186 (tcp)
TestEventFire_token - 2019/12/30 18:54:52.752666 [INFO] agent: started state syncer
2019/12/30 18:54:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:52 [INFO]  raft: Node at 127.0.0.1:18190 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:54:53.283003 [DEBUG] consul: Skipping self join check for "Node 5122c9d8-8979-c841-956f-094a90e62880" since the cluster is too small
TestEventList - 2019/12/30 18:54:53.283478 [INFO] agent: Synced node info
TestEventList_Filter - 2019/12/30 18:54:53.284279 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_Filter - 2019/12/30 18:54:53.284799 [DEBUG] consul: Skipping self join check for "Node f3be110f-c9d4-535b-320a-738c68af895e" since the cluster is too small
TestEventList_Filter - 2019/12/30 18:54:53.285051 [INFO] consul: member 'Node f3be110f-c9d4-535b-320a-738c68af895e' joined, marking health alive
2019/12/30 18:54:53 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:53 [INFO]  raft: Node at 127.0.0.1:18190 [Leader] entering Leader state
TestEventFire_token - 2019/12/30 18:54:53.511973 [INFO] consul: cluster leadership acquired
TestEventFire_token - 2019/12/30 18:54:53.512430 [INFO] consul: New leader elected: Node 88149a31-421c-297f-f289-64fe65dc4fa8
TestEventList_Filter - 2019/12/30 18:54:53.516007 [DEBUG] consul: User event: test
TestEventList_Filter - 2019/12/30 18:54:53.516110 [DEBUG] consul: User event: foo
TestEventList_Filter - 2019/12/30 18:54:53.516250 [DEBUG] agent: new event: test (85aef772-9779-ffbe-363e-513c91b608fa)
TestEventList_Filter - 2019/12/30 18:54:53.516352 [DEBUG] agent: new event: foo (9500aef0-9790-95ee-865e-c1a112d16a2a)
TestEventList_Filter - 2019/12/30 18:54:53.541404 [INFO] agent: Requesting shutdown
TestEventList_Filter - 2019/12/30 18:54:53.541526 [INFO] consul: shutting down server
TestEventList_Filter - 2019/12/30 18:54:53.541578 [WARN] serf: Shutdown without a Leave
TestEventList_Filter - 2019/12/30 18:54:53.712360 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/30 18:54:53.714090 [ERR] agent: failed to sync remote state: ACL not found
TestEventList_Filter - 2019/12/30 18:54:53.782702 [INFO] manager: shutting down
TestEventList_Filter - 2019/12/30 18:54:53.783255 [INFO] agent: consul server down
TestEventList_Filter - 2019/12/30 18:54:53.783333 [INFO] agent: shutdown complete
TestEventList_Filter - 2019/12/30 18:54:53.783392 [INFO] agent: Stopping DNS server 127.0.0.1:18173 (tcp)
TestEventList_Filter - 2019/12/30 18:54:53.783544 [INFO] agent: Stopping DNS server 127.0.0.1:18173 (udp)
TestEventList_Filter - 2019/12/30 18:54:53.783728 [INFO] agent: Stopping HTTP server 127.0.0.1:18174 (tcp)
TestEventList_Filter - 2019/12/30 18:54:53.783959 [INFO] agent: Waiting for endpoints to shut down
TestEventList_Filter - 2019/12/30 18:54:53.784044 [INFO] agent: Endpoints down
--- PASS: TestEventList_Filter (6.09s)
=== CONT  TestEventFire
WARNING: bootstrap = true: do not enable unless necessary
TestEventFire - 2019/12/30 18:54:53.868776 [WARN] agent: Node name "Node 40753e30-3232-fae4-e6a6-cc215b3afb99" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventFire - 2019/12/30 18:54:53.869362 [DEBUG] tlsutil: Update with version 1
TestEventFire - 2019/12/30 18:54:53.872945 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/30 18:54:53.884201 [INFO] acl: initializing acls
TestEventList_ACLFilter - 2019/12/30 18:54:53.983589 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_ACLFilter - 2019/12/30 18:54:53.984112 [DEBUG] consul: Skipping self join check for "Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4" since the cluster is too small
TestEventList_ACLFilter - 2019/12/30 18:54:53.984223 [INFO] consul: member 'Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4' joined, marking health alive
TestEventFire_token - 2019/12/30 18:54:54.075903 [INFO] consul: Created ACL 'global-management' policy
TestEventFire_token - 2019/12/30 18:54:54.076001 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_ACLFilter - 2019/12/30 18:54:54.178486 [DEBUG] consul: Skipping self join check for "Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4" since the cluster is too small
TestEventList_ACLFilter - 2019/12/30 18:54:54.179089 [DEBUG] consul: Skipping self join check for "Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4" since the cluster is too small
TestEventList_ACLFilter - 2019/12/30 18:54:54.183111 [DEBUG] consul: dropping node "Node 72f69057-a23d-ff1b-4f07-c7769a47ebb4" from result due to ACLs
TestEventList_ACLFilter - 2019/12/30 18:54:54.188449 [INFO] agent: Requesting shutdown
TestEventList_ACLFilter - 2019/12/30 18:54:54.188757 [INFO] consul: shutting down server
TestEventList_ACLFilter - 2019/12/30 18:54:54.188959 [WARN] serf: Shutdown without a Leave
TestEventList_ACLFilter - 2019/12/30 18:54:54.188507 [DEBUG] consul: User event: foo
TestEventList_ACLFilter - 2019/12/30 18:54:54.189558 [DEBUG] agent: new event: foo (16cef0e4-69c5-1edd-e0a1-d931fcfc0d8d)
TestEventFire_token - 2019/12/30 18:54:54.250710 [INFO] consul: Bootstrapped ACL master token from configuration
TestEventFire_token - 2019/12/30 18:54:54.300804 [INFO] acl: initializing acls
TestEventFire_token - 2019/12/30 18:54:54.301051 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_ACLFilter - 2019/12/30 18:54:54.324207 [WARN] serf: Shutdown without a Leave
TestEventList_ACLFilter - 2019/12/30 18:54:54.391038 [INFO] manager: shutting down
TestEventList_ACLFilter - 2019/12/30 18:54:54.391395 [INFO] agent: consul server down
TestEventList_ACLFilter - 2019/12/30 18:54:54.391444 [INFO] agent: shutdown complete
TestEventList_ACLFilter - 2019/12/30 18:54:54.391513 [INFO] agent: Stopping DNS server 127.0.0.1:18167 (tcp)
TestEventList_ACLFilter - 2019/12/30 18:54:54.391707 [INFO] agent: Stopping DNS server 127.0.0.1:18167 (udp)
TestEventList_ACLFilter - 2019/12/30 18:54:54.391892 [INFO] agent: Stopping HTTP server 127.0.0.1:18168 (tcp)
TestEventList_ACLFilter - 2019/12/30 18:54:54.392137 [INFO] agent: Waiting for endpoints to shut down
TestEventList_ACLFilter - 2019/12/30 18:54:54.392216 [INFO] agent: Endpoints down
--- PASS: TestEventList_ACLFilter (6.88s)
=== CONT  TestDNS_ReloadConfig_DuringQuery
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:54.536881 [WARN] agent: Node name "Node d6fb4e07-6e92-c8dc-720d-f1f594cb9f77" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:54.538909 [DEBUG] tlsutil: Update with version 1
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:54.544947 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList - 2019/12/30 18:54:54.602446 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList - 2019/12/30 18:54:54.602899 [DEBUG] consul: Skipping self join check for "Node c46f46ba-f3a7-5d46-b149-2181ee989c49" since the cluster is too small
TestEventList - 2019/12/30 18:54:54.603038 [INFO] consul: member 'Node c46f46ba-f3a7-5d46-b149-2181ee989c49' joined, marking health alive
jones - 2019/12/30 18:54:54.672888 [DEBUG] manager: Rebalanced 1 servers, next active server is Node a8b3e297-b53a-bcd0-efda-5addcd938805.dc1 (Addr: tcp/127.0.0.1:17728) (DC: dc1)
TestEventFire_token - 2019/12/30 18:54:54.868424 [INFO] agent: Synced node info
TestEventFire_token - 2019/12/30 18:54:54.868559 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/30 18:54:54.868678 [INFO] consul: Created ACL anonymous token from configuration
TestEventFire_token - 2019/12/30 18:54:54.869862 [INFO] serf: EventMemberUpdate: Node 88149a31-421c-297f-f289-64fe65dc4fa8
TestEventFire_token - 2019/12/30 18:54:54.870614 [INFO] serf: EventMemberUpdate: Node 88149a31-421c-297f-f289-64fe65dc4fa8.dc1
TestEventFire_token - 2019/12/30 18:54:54.870959 [INFO] consul: Created ACL anonymous token from configuration
TestEventFire_token - 2019/12/30 18:54:54.871018 [DEBUG] acl: transitioning out of legacy ACL mode
TestEventFire_token - 2019/12/30 18:54:54.871750 [INFO] serf: EventMemberUpdate: Node 88149a31-421c-297f-f289-64fe65dc4fa8
TestEventFire_token - 2019/12/30 18:54:54.872356 [INFO] serf: EventMemberUpdate: Node 88149a31-421c-297f-f289-64fe65dc4fa8.dc1
TestEventFire_token - 2019/12/30 18:54:54.876859 [DEBUG] consul: dropping node "Node 88149a31-421c-297f-f289-64fe65dc4fa8" from result due to ACLs
TestEventFire_token - 2019/12/30 18:54:54.877314 [DEBUG] consul: dropping node "Node 88149a31-421c-297f-f289-64fe65dc4fa8" from result due to ACLs
TestEventList - 2019/12/30 18:54:54.879936 [DEBUG] consul: User event: test
TestEventList - 2019/12/30 18:54:54.880141 [DEBUG] agent: new event: test (f08dc7fd-4c3d-89f7-7e1c-cc00710b07fc)
TestEventList - 2019/12/30 18:54:54.904915 [INFO] agent: Requesting shutdown
TestEventList - 2019/12/30 18:54:54.905018 [INFO] consul: shutting down server
TestEventList - 2019/12/30 18:54:54.905070 [WARN] serf: Shutdown without a Leave
TestEventList - 2019/12/30 18:54:55.049324 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:40753e30-3232-fae4-e6a6-cc215b3afb99 Address:127.0.0.1:18196}]
TestEventList - 2019/12/30 18:54:55.126559 [INFO] manager: shutting down
TestEventList - 2019/12/30 18:54:55.126952 [INFO] agent: consul server down
TestEventList - 2019/12/30 18:54:55.126995 [INFO] agent: shutdown complete
TestEventList - 2019/12/30 18:54:55.127043 [INFO] agent: Stopping DNS server 127.0.0.1:18179 (tcp)
TestEventList - 2019/12/30 18:54:55.127158 [INFO] agent: Stopping DNS server 127.0.0.1:18179 (udp)
TestEventList - 2019/12/30 18:54:55.127295 [INFO] agent: Stopping HTTP server 127.0.0.1:18180 (tcp)
TestEventList - 2019/12/30 18:54:55.127465 [INFO] agent: Waiting for endpoints to shut down
TestEventList - 2019/12/30 18:54:55.127533 [INFO] agent: Endpoints down
--- PASS: TestEventList (5.73s)
=== CONT  TestDNS_ConfigReload
2019/12/30 18:54:55 [INFO]  raft: Node at 127.0.0.1:18196 [Follower] entering Follower state (Leader: "")
TestEventFire - 2019/12/30 18:54:55.131546 [INFO] serf: EventMemberJoin: Node 40753e30-3232-fae4-e6a6-cc215b3afb99.dc1 127.0.0.1
TestEventFire - 2019/12/30 18:54:55.137786 [INFO] serf: EventMemberJoin: Node 40753e30-3232-fae4-e6a6-cc215b3afb99 127.0.0.1
TestEventFire - 2019/12/30 18:54:55.138660 [INFO] consul: Adding LAN server Node 40753e30-3232-fae4-e6a6-cc215b3afb99 (Addr: tcp/127.0.0.1:18196) (DC: dc1)
TestEventFire - 2019/12/30 18:54:55.139245 [INFO] consul: Handled member-join event for server "Node 40753e30-3232-fae4-e6a6-cc215b3afb99.dc1" in area "wan"
TestEventFire - 2019/12/30 18:54:55.140297 [INFO] agent: Started DNS server 127.0.0.1:18191 (tcp)
TestEventFire - 2019/12/30 18:54:55.140619 [INFO] agent: Started DNS server 127.0.0.1:18191 (udp)
TestEventFire - 2019/12/30 18:54:55.142833 [INFO] agent: Started HTTP server on 127.0.0.1:18192 (tcp)
TestEventFire - 2019/12/30 18:54:55.142928 [INFO] agent: started state syncer
2019/12/30 18:54:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:55 [INFO]  raft: Node at 127.0.0.1:18196 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ConfigReload - 2019/12/30 18:54:55.220366 [WARN] agent: Node name "Node de87c8e3-2447-2709-d69c-888a1e0464db" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ConfigReload - 2019/12/30 18:54:55.220834 [DEBUG] tlsutil: Update with version 1
TestDNS_ConfigReload - 2019/12/30 18:54:55.222987 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/30 18:54:55.406340 [WARN] consul: user event "foo" blocked by ACLs
TestEventFire_token - 2019/12/30 18:54:55.407181 [WARN] consul: user event "bar" blocked by ACLs
TestEventFire_token - 2019/12/30 18:54:55.409877 [INFO] agent: Requesting shutdown
TestEventFire_token - 2019/12/30 18:54:55.410146 [INFO] consul: shutting down server
TestEventFire_token - 2019/12/30 18:54:55.410322 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/30 18:54:55.554467 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d6fb4e07-6e92-c8dc-720d-f1f594cb9f77 Address:127.0.0.1:18202}]
TestEventFire_token - 2019/12/30 18:54:55.658219 [INFO] manager: shutting down
2019/12/30 18:54:55 [INFO]  raft: Node at 127.0.0.1:18202 [Follower] entering Follower state (Leader: "")
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.661414 [INFO] serf: EventMemberJoin: Node d6fb4e07-6e92-c8dc-720d-f1f594cb9f77.dc1 127.0.0.1
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.667811 [INFO] serf: EventMemberJoin: Node d6fb4e07-6e92-c8dc-720d-f1f594cb9f77 127.0.0.1
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.670268 [INFO] consul: Adding LAN server Node d6fb4e07-6e92-c8dc-720d-f1f594cb9f77 (Addr: tcp/127.0.0.1:18202) (DC: dc1)
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.670906 [INFO] consul: Handled member-join event for server "Node d6fb4e07-6e92-c8dc-720d-f1f594cb9f77.dc1" in area "wan"
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.680510 [INFO] agent: Started DNS server 127.0.0.1:18197 (tcp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.680803 [INFO] agent: Started DNS server 127.0.0.1:18197 (udp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.688304 [INFO] agent: Started HTTP server on 127.0.0.1:18198 (tcp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:55.688400 [INFO] agent: started state syncer
2019/12/30 18:54:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:55 [INFO]  raft: Node at 127.0.0.1:18202 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:54:55.793449 [DEBUG] consul: Skipping self join check for "Node a8b3e297-b53a-bcd0-efda-5addcd938805" since the cluster is too small
TestEventFire_token - 2019/12/30 18:54:55.871677 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/30 18:54:55 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:55 [INFO]  raft: Node at 127.0.0.1:18196 [Leader] entering Leader state
TestEventFire_token - 2019/12/30 18:54:55.891238 [INFO] agent: consul server down
TestEventFire_token - 2019/12/30 18:54:55.891281 [INFO] agent: shutdown complete
TestEventFire_token - 2019/12/30 18:54:55.891333 [INFO] agent: Stopping DNS server 127.0.0.1:18185 (tcp)
TestEventFire_token - 2019/12/30 18:54:55.891468 [INFO] agent: Stopping DNS server 127.0.0.1:18185 (udp)
TestEventFire_token - 2019/12/30 18:54:55.891629 [INFO] agent: Stopping HTTP server 127.0.0.1:18186 (tcp)
TestEventFire_token - 2019/12/30 18:54:55.891836 [INFO] agent: Waiting for endpoints to shut down
TestEventFire_token - 2019/12/30 18:54:55.891904 [INFO] agent: Endpoints down
--- PASS: TestEventFire_token (5.78s)
=== CONT  TestDNS_Compression_Recurse
TestEventFire_token - 2019/12/30 18:54:55.899524 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestEventFire - 2019/12/30 18:54:55.899914 [INFO] consul: cluster leadership acquired
TestEventFire - 2019/12/30 18:54:55.900278 [INFO] consul: New leader elected: Node 40753e30-3232-fae4-e6a6-cc215b3afb99
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_Recurse - 2019/12/30 18:54:56.025669 [WARN] agent: Node name "Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_Recurse - 2019/12/30 18:54:56.026051 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_Recurse - 2019/12/30 18:54:56.028310 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire - 2019/12/30 18:54:56.563985 [INFO] agent: Synced node info
2019/12/30 18:54:56 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:56 [INFO]  raft: Node at 127.0.0.1:18202 [Leader] entering Leader state
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:56.839853 [INFO] consul: cluster leadership acquired
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:56.840299 [INFO] consul: New leader elected: Node d6fb4e07-6e92-c8dc-720d-f1f594cb9f77
2019/12/30 18:54:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:de87c8e3-2447-2709-d69c-888a1e0464db Address:127.0.0.1:18208}]
2019/12/30 18:54:57 [INFO]  raft: Node at 127.0.0.1:18208 [Follower] entering Follower state (Leader: "")
TestDNS_ConfigReload - 2019/12/30 18:54:57.037581 [INFO] serf: EventMemberJoin: Node de87c8e3-2447-2709-d69c-888a1e0464db.dc1 127.0.0.1
TestDNS_ConfigReload - 2019/12/30 18:54:57.042150 [INFO] serf: EventMemberJoin: Node de87c8e3-2447-2709-d69c-888a1e0464db 127.0.0.1
TestDNS_ConfigReload - 2019/12/30 18:54:57.043051 [DEBUG] dns: recursor enabled
TestDNS_ConfigReload - 2019/12/30 18:54:57.043581 [INFO] agent: Started DNS server 127.0.0.1:18203 (udp)
TestDNS_ConfigReload - 2019/12/30 18:54:57.044664 [INFO] consul: Adding LAN server Node de87c8e3-2447-2709-d69c-888a1e0464db (Addr: tcp/127.0.0.1:18208) (DC: dc1)
TestDNS_ConfigReload - 2019/12/30 18:54:57.044903 [INFO] consul: Handled member-join event for server "Node de87c8e3-2447-2709-d69c-888a1e0464db.dc1" in area "wan"
TestDNS_ConfigReload - 2019/12/30 18:54:57.045016 [DEBUG] dns: recursor enabled
TestDNS_ConfigReload - 2019/12/30 18:54:57.045323 [INFO] agent: Started DNS server 127.0.0.1:18203 (tcp)
TestDNS_ConfigReload - 2019/12/30 18:54:57.047646 [INFO] agent: Started HTTP server on 127.0.0.1:18204 (tcp)
TestDNS_ConfigReload - 2019/12/30 18:54:57.047784 [INFO] agent: started state syncer
2019/12/30 18:54:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:57 [INFO]  raft: Node at 127.0.0.1:18208 [Candidate] entering Candidate state in term 2
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.383769 [INFO] agent: Synced node info
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.387460 [WARN] consul: endpoint injected; this should only be used for testing
jones - 2019/12/30 18:54:57.398560 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d.dc1 (Addr: tcp/127.0.0.1:17734) (DC: dc1)
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.438438 [DEBUG] tlsutil: Update with version 2
2019/12/30 18:54:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8ffd5f14-b41f-71a6-8bbb-e532d9d266ae Address:127.0.0.1:18214}]
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.481753 [INFO] serf: EventMemberJoin: Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae.dc1 127.0.0.1
2019/12/30 18:54:57 [INFO]  raft: Node at 127.0.0.1:18214 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.489563 [INFO] serf: EventMemberJoin: Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae 127.0.0.1
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.490365 [DEBUG] dns: request for name nope.query.consul. type A class IN (took 100.662994ms) from client 127.0.0.1:41371 (udp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.490851 [INFO] agent: Requesting shutdown
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.491298 [INFO] consul: shutting down server
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.491344 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.495879 [INFO] consul: Adding LAN server Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae (Addr: tcp/127.0.0.1:18214) (DC: dc1)
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.496503 [INFO] consul: Handled member-join event for server "Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae.dc1" in area "wan"
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.497990 [DEBUG] dns: recursor enabled
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.498527 [DEBUG] dns: recursor enabled
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.499177 [INFO] agent: Started DNS server 127.0.0.1:18209 (tcp)
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.499354 [INFO] agent: Started DNS server 127.0.0.1:18209 (udp)
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.502432 [INFO] agent: Started HTTP server on 127.0.0.1:18210 (tcp)
TestDNS_Compression_Recurse - 2019/12/30 18:54:57.502532 [INFO] agent: started state syncer
2019/12/30 18:54:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:57 [INFO]  raft: Node at 127.0.0.1:18214 [Candidate] entering Candidate state in term 2
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.657774 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:57 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:57 [INFO]  raft: Node at 127.0.0.1:18208 [Leader] entering Leader state
TestDNS_ConfigReload - 2019/12/30 18:54:57.658895 [INFO] consul: cluster leadership acquired
TestDNS_ConfigReload - 2019/12/30 18:54:57.659455 [INFO] consul: New leader elected: Node de87c8e3-2447-2709-d69c-888a1e0464db
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:57.841921 [INFO] manager: shutting down
TestEventFire - 2019/12/30 18:54:58.087562 [DEBUG] agent: Node info in sync
TestEventFire - 2019/12/30 18:54:58.087679 [DEBUG] agent: Node info in sync
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.399819 [INFO] agent: consul server down
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.399919 [INFO] agent: shutdown complete
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.399988 [INFO] agent: Stopping DNS server 127.0.0.1:18197 (tcp)
jones - 2019/12/30 18:54:58.400060 [DEBUG] consul: Skipping self join check for "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d" since the cluster is too small
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.400149 [INFO] agent: Stopping DNS server 127.0.0.1:18197 (udp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.400309 [INFO] agent: Stopping HTTP server 127.0.0.1:18198 (tcp)
TestEventFire - 2019/12/30 18:54:58.400454 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.400523 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.400595 [INFO] agent: Endpoints down
--- PASS: TestDNS_ReloadConfig_DuringQuery (4.01s)
=== CONT  TestDNS_Compression_ReverseLookup
TestEventFire - 2019/12/30 18:54:58.400824 [DEBUG] consul: Skipping self join check for "Node 40753e30-3232-fae4-e6a6-cc215b3afb99" since the cluster is too small
TestEventFire - 2019/12/30 18:54:58.400978 [INFO] consul: member 'Node 40753e30-3232-fae4-e6a6-cc215b3afb99' joined, marking health alive
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.404766 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_ReloadConfig_DuringQuery - 2019/12/30 18:54:58.405002 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:58.480305 [WARN] agent: Node name "Node 4abfc426-2c50-ac3b-d5e5-381330d675f3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:58.480949 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:58.483513 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ConfigReload - 2019/12/30 18:54:58.625456 [INFO] agent: Synced node info
TestDNS_ConfigReload - 2019/12/30 18:54:58.625586 [DEBUG] agent: Node info in sync
TestDNS_ConfigReload - 2019/12/30 18:54:58.643793 [DEBUG] tlsutil: Update with version 2
TestDNS_ConfigReload - 2019/12/30 18:54:58.645665 [INFO] agent: Requesting shutdown
TestDNS_ConfigReload - 2019/12/30 18:54:58.645779 [INFO] consul: shutting down server
TestDNS_ConfigReload - 2019/12/30 18:54:58.645830 [WARN] serf: Shutdown without a Leave
2019/12/30 18:54:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:58 [INFO]  raft: Node at 127.0.0.1:18214 [Leader] entering Leader state
TestDNS_Compression_Recurse - 2019/12/30 18:54:58.742840 [INFO] consul: cluster leadership acquired
TestDNS_Compression_Recurse - 2019/12/30 18:54:58.743308 [INFO] consul: New leader elected: Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae
TestEventFire - 2019/12/30 18:54:58.752261 [INFO] agent: Requesting shutdown
TestEventFire - 2019/12/30 18:54:58.752377 [INFO] consul: shutting down server
TestEventFire - 2019/12/30 18:54:58.752411 [DEBUG] consul: User event: test
TestEventFire - 2019/12/30 18:54:58.752426 [WARN] serf: Shutdown without a Leave
TestDNS_ConfigReload - 2019/12/30 18:54:58.847027 [WARN] serf: Shutdown without a Leave
TestEventFire - 2019/12/30 18:54:58.848532 [WARN] serf: Shutdown without a Leave
TestDNS_ConfigReload - 2019/12/30 18:54:58.928350 [INFO] manager: shutting down
TestEventFire - 2019/12/30 18:54:58.929320 [INFO] manager: shutting down
TestEventFire - 2019/12/30 18:54:58.930647 [INFO] agent: consul server down
TestEventFire - 2019/12/30 18:54:58.930711 [INFO] agent: shutdown complete
TestEventFire - 2019/12/30 18:54:58.930773 [INFO] agent: Stopping DNS server 127.0.0.1:18191 (tcp)
TestEventFire - 2019/12/30 18:54:58.931183 [INFO] agent: Stopping DNS server 127.0.0.1:18191 (udp)
TestEventFire - 2019/12/30 18:54:58.931510 [INFO] agent: Stopping HTTP server 127.0.0.1:18192 (tcp)
TestEventFire - 2019/12/30 18:54:58.931897 [INFO] agent: Waiting for endpoints to shut down
TestEventFire - 2019/12/30 18:54:58.932043 [INFO] agent: Endpoints down
=== CONT  TestDNS_Compression_Query
--- PASS: TestEventFire (5.15s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_Query - 2019/12/30 18:54:58.988967 [WARN] agent: Node name "Node 9b8dcad8-46fb-1c35-4ebf-0e96d1ba08f5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_Query - 2019/12/30 18:54:58.989513 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_Query - 2019/12/30 18:54:58.991739 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ConfigReload - 2019/12/30 18:54:59.000136 [INFO] agent: consul server down
TestDNS_ConfigReload - 2019/12/30 18:54:59.000233 [INFO] agent: shutdown complete
TestDNS_ConfigReload - 2019/12/30 18:54:59.000300 [INFO] agent: Stopping DNS server 127.0.0.1:18203 (tcp)
TestDNS_ConfigReload - 2019/12/30 18:54:59.000486 [INFO] agent: Stopping DNS server 127.0.0.1:18203 (udp)
TestDNS_ConfigReload - 2019/12/30 18:54:59.000684 [INFO] agent: Stopping HTTP server 127.0.0.1:18204 (tcp)
TestDNS_ConfigReload - 2019/12/30 18:54:59.000940 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ConfigReload - 2019/12/30 18:54:59.001022 [INFO] agent: Endpoints down
--- PASS: TestDNS_ConfigReload (3.87s)
=== CONT  TestDNS_Compression_trimUDPResponse
TestDNS_ConfigReload - 2019/12/30 18:54:59.002521 [ERR] consul: failed to establish leadership: leadership lost while committing log
--- PASS: TestDNS_Compression_trimUDPResponse (0.04s)
=== CONT  TestDNS_syncExtra
--- PASS: TestDNS_syncExtra (0.00s)
=== CONT  TestDNS_trimUDPResponse_TrimSizeEDNS
TestDNS_Compression_Recurse - 2019/12/30 18:54:59.075516 [INFO] agent: Synced node info
TestDNS_Compression_Recurse - 2019/12/30 18:54:59.075670 [DEBUG] agent: Node info in sync
--- PASS: TestDNS_trimUDPResponse_TrimSizeEDNS (0.04s)
=== CONT  TestDNS_trimUDPResponse_TrimSize
=== CONT  TestDNS_trimUDPResponse_TrimLimit
--- PASS: TestDNS_trimUDPResponse_TrimSize (0.05s)
=== CONT  TestDNS_PreparedQuery_AgentSource
--- PASS: TestDNS_trimUDPResponse_TrimLimit (0.05s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:54:59.256746 [WARN] agent: Node name "Node 2f1d373a-cf76-88e4-94e1-979db637064f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:54:59.257338 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:54:59.260089 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4abfc426-2c50-ac3b-d5e5-381330d675f3 Address:127.0.0.1:18220}]
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.412323 [INFO] serf: EventMemberJoin: Node 4abfc426-2c50-ac3b-d5e5-381330d675f3.dc1 127.0.0.1
2019/12/30 18:54:59 [INFO]  raft: Node at 127.0.0.1:18220 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.422669 [INFO] serf: EventMemberJoin: Node 4abfc426-2c50-ac3b-d5e5-381330d675f3 127.0.0.1
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.423663 [INFO] consul: Adding LAN server Node 4abfc426-2c50-ac3b-d5e5-381330d675f3 (Addr: tcp/127.0.0.1:18220) (DC: dc1)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.424047 [INFO] consul: Handled member-join event for server "Node 4abfc426-2c50-ac3b-d5e5-381330d675f3.dc1" in area "wan"
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.424211 [INFO] agent: Started DNS server 127.0.0.1:18215 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.424734 [INFO] agent: Started DNS server 127.0.0.1:18215 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.427237 [INFO] agent: Started HTTP server on 127.0.0.1:18216 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:54:59.427363 [INFO] agent: started state syncer
2019/12/30 18:54:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:59 [INFO]  raft: Node at 127.0.0.1:18220 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:54:59.514954 [DEBUG] manager: Rebalanced 1 servers, next active server is Node f632792c-c81a-fbfb-b7c4-e99bdb454ade.dc1 (Addr: tcp/127.0.0.1:17740) (DC: dc1)
TestDNS_Compression_Recurse - 2019/12/30 18:54:59.997608 [DEBUG] agent: Node info in sync
2019/12/30 18:55:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9b8dcad8-46fb-1c35-4ebf-0e96d1ba08f5 Address:127.0.0.1:18226}]
2019/12/30 18:55:00 [INFO]  raft: Node at 127.0.0.1:18226 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_Query - 2019/12/30 18:55:00.132510 [INFO] serf: EventMemberJoin: Node 9b8dcad8-46fb-1c35-4ebf-0e96d1ba08f5.dc1 127.0.0.1
TestDNS_Compression_Query - 2019/12/30 18:55:00.136258 [INFO] serf: EventMemberJoin: Node 9b8dcad8-46fb-1c35-4ebf-0e96d1ba08f5 127.0.0.1
TestDNS_Compression_Query - 2019/12/30 18:55:00.137349 [INFO] consul: Adding LAN server Node 9b8dcad8-46fb-1c35-4ebf-0e96d1ba08f5 (Addr: tcp/127.0.0.1:18226) (DC: dc1)
TestDNS_Compression_Query - 2019/12/30 18:55:00.137393 [INFO] consul: Handled member-join event for server "Node 9b8dcad8-46fb-1c35-4ebf-0e96d1ba08f5.dc1" in area "wan"
TestDNS_Compression_Query - 2019/12/30 18:55:00.159748 [INFO] agent: Started DNS server 127.0.0.1:18221 (tcp)
TestDNS_Compression_Query - 2019/12/30 18:55:00.160652 [INFO] agent: Started DNS server 127.0.0.1:18221 (udp)
TestDNS_Compression_Query - 2019/12/30 18:55:00.163217 [INFO] agent: Started HTTP server on 127.0.0.1:18222 (tcp)
TestDNS_Compression_Query - 2019/12/30 18:55:00.163326 [INFO] agent: started state syncer
2019/12/30 18:55:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:00 [INFO]  raft: Node at 127.0.0.1:18226 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:00 [INFO]  raft: Node at 127.0.0.1:18220 [Leader] entering Leader state
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:00.674562 [INFO] consul: cluster leadership acquired
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:00.675206 [INFO] consul: New leader elected: Node 4abfc426-2c50-ac3b-d5e5-381330d675f3
TestDNS_Compression_Recurse - 2019/12/30 18:55:01.450788 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_Compression_Recurse - 2019/12/30 18:55:01.451326 [DEBUG] consul: Skipping self join check for "Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae" since the cluster is too small
TestDNS_Compression_Recurse - 2019/12/30 18:55:01.451509 [INFO] consul: member 'Node 8ffd5f14-b41f-71a6-8bbb-e532d9d266ae' joined, marking health alive
TestDNS_Compression_Recurse - 2019/12/30 18:55:01.454538 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:55:01.455360 [DEBUG] consul: Skipping self join check for "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade" since the cluster is too small
2019/12/30 18:55:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2f1d373a-cf76-88e4-94e1-979db637064f Address:127.0.0.1:18232}]
2019/12/30 18:55:01 [INFO]  raft: Node at 127.0.0.1:18232 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.766143 [INFO] serf: EventMemberJoin: Node 2f1d373a-cf76-88e4-94e1-979db637064f.dc1 127.0.0.1
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.770982 [INFO] serf: EventMemberJoin: Node 2f1d373a-cf76-88e4-94e1-979db637064f 127.0.0.1
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.775436 [INFO] agent: Started DNS server 127.0.0.1:18227 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.776002 [INFO] consul: Adding LAN server Node 2f1d373a-cf76-88e4-94e1-979db637064f (Addr: tcp/127.0.0.1:18232) (DC: dc1)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.777862 [INFO] agent: Started DNS server 127.0.0.1:18227 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.779747 [INFO] consul: Handled member-join event for server "Node 2f1d373a-cf76-88e4-94e1-979db637064f.dc1" in area "wan"
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.780446 [INFO] agent: Started HTTP server on 127.0.0.1:18228 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:01.780552 [INFO] agent: started state syncer
2019/12/30 18:55:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:01 [INFO]  raft: Node at 127.0.0.1:18232 [Candidate] entering Candidate state in term 2
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.275356 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (458.012µs) Recursor queried: 127.0.0.1:40501
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.275652 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.2477ms) from client 127.0.0.1:41717 (udp)
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.276667 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (355.01µs) Recursor queried: 127.0.0.1:40501
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.276867 [INFO] agent: Requesting shutdown
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.276960 [INFO] consul: shutting down server
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.277011 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.276889 [DEBUG] dns: request for {apple.com. 255 1} (udp) (952.358µs) from client 127.0.0.1:41717 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:02.385597 [INFO] agent: Synced node info
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:02.385725 [DEBUG] agent: Node info in sync
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.387670 [WARN] serf: Shutdown without a Leave
2019/12/30 18:55:02 [INFO]  raft: Election won. Tally: 1
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.474612 [INFO] manager: shutting down
2019/12/30 18:55:02 [INFO]  raft: Node at 127.0.0.1:18226 [Leader] entering Leader state
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.475533 [INFO] agent: consul server down
TestDNS_Compression_Query - 2019/12/30 18:55:02.475564 [INFO] consul: cluster leadership acquired
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.475589 [INFO] agent: shutdown complete
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.475646 [INFO] agent: Stopping DNS server 127.0.0.1:18209 (tcp)
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.475836 [INFO] agent: Stopping DNS server 127.0.0.1:18209 (udp)
TestDNS_Compression_Query - 2019/12/30 18:55:02.476002 [INFO] consul: New leader elected: Node 9b8dcad8-46fb-1c35-4ebf-0e96d1ba08f5
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.476010 [INFO] agent: Stopping HTTP server 127.0.0.1:18210 (tcp)
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.476284 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_Recurse - 2019/12/30 18:55:02.476374 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_Recurse (6.58s)
=== CONT  TestDNS_InvalidQueries
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_InvalidQueries - 2019/12/30 18:55:02.538955 [WARN] agent: Node name "Node eb2d2cc2-b5b2-cd98-0120-3fd2a95c904c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_InvalidQueries - 2019/12/30 18:55:02.539493 [DEBUG] tlsutil: Update with version 1
TestDNS_InvalidQueries - 2019/12/30 18:55:02.541631 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:02 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:02 [INFO]  raft: Node at 127.0.0.1:18232 [Leader] entering Leader state
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:02.959280 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:02.959766 [INFO] consul: New leader elected: Node 2f1d373a-cf76-88e4-94e1-979db637064f
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:02.964813 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (579.682µs) from client 127.0.0.1:37117 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:02.965716 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (411.011µs) from client 127.0.0.1:37117 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:02.965832 [INFO] agent: Requesting shutdown
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:02.965885 [INFO] consul: shutting down server
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:02.965934 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Query - 2019/12/30 18:55:03.078440 [INFO] agent: Synced node info
TestDNS_Compression_Query - 2019/12/30 18:55:03.078586 [DEBUG] agent: Node info in sync
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.080674 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.208963 [INFO] manager: shutting down
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.349737 [INFO] agent: consul server down
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.349806 [INFO] agent: shutdown complete
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.349862 [INFO] agent: Stopping DNS server 127.0.0.1:18215 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.349994 [INFO] agent: Stopping DNS server 127.0.0.1:18215 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.350150 [INFO] agent: Stopping HTTP server 127.0.0.1:18216 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.350351 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.350418 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_ReverseLookup (4.95s)
=== CONT  TestDNS_PreparedQuery_AllowStale
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.354825 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_Compression_ReverseLookup - 2019/12/30 18:55:03.355126 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:03.433855 [WARN] agent: Node name "Node 1e34a2e5-4bbf-2b9c-4502-c390494eba6b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:03.434476 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:03.439178 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.556606 [INFO] agent: Synced node info
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.569897 [WARN] consul: endpoint injected; this should only be used for testing
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.571423 [DEBUG] dns: request for name foo.query.consul. type SRV class IN (took 457.679µs) from client 127.0.0.1:36255 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.571692 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.571974 [INFO] consul: shutting down server
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.572021 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.670077 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.857877 [INFO] manager: shutting down
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.862133 [INFO] agent: consul server down
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.862202 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.862260 [INFO] agent: Stopping DNS server 127.0.0.1:18227 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.862391 [INFO] agent: Stopping DNS server 127.0.0.1:18227 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.862538 [INFO] agent: Stopping HTTP server 127.0.0.1:18228 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.862734 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.862800 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_AgentSource (4.68s)
=== CONT  TestDNS_AltDomains_Overlap
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.868745 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.869107 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.869261 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.869346 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.869457 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_PreparedQuery_AgentSource - 2019/12/30 18:55:03.869516 [ERR] consul: failed to transfer leadership in 3 attempts
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:03.937799 [DEBUG] tlsutil: Update with version 1
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:03.940171 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eb2d2cc2-b5b2-cd98-0120-3fd2a95c904c Address:127.0.0.1:18238}]
TestDNS_InvalidQueries - 2019/12/30 18:55:04.012468 [INFO] serf: EventMemberJoin: Node eb2d2cc2-b5b2-cd98-0120-3fd2a95c904c.dc1 127.0.0.1
2019/12/30 18:55:04 [INFO]  raft: Node at 127.0.0.1:18238 [Follower] entering Follower state (Leader: "")
TestDNS_InvalidQueries - 2019/12/30 18:55:04.018097 [INFO] serf: EventMemberJoin: Node eb2d2cc2-b5b2-cd98-0120-3fd2a95c904c 127.0.0.1
TestDNS_InvalidQueries - 2019/12/30 18:55:04.019307 [INFO] consul: Adding LAN server Node eb2d2cc2-b5b2-cd98-0120-3fd2a95c904c (Addr: tcp/127.0.0.1:18238) (DC: dc1)
TestDNS_InvalidQueries - 2019/12/30 18:55:04.019869 [INFO] consul: Handled member-join event for server "Node eb2d2cc2-b5b2-cd98-0120-3fd2a95c904c.dc1" in area "wan"
TestDNS_InvalidQueries - 2019/12/30 18:55:04.021115 [INFO] agent: Started DNS server 127.0.0.1:18233 (tcp)
TestDNS_InvalidQueries - 2019/12/30 18:55:04.021552 [INFO] agent: Started DNS server 127.0.0.1:18233 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:04.024736 [INFO] agent: Started HTTP server on 127.0.0.1:18234 (tcp)
TestDNS_InvalidQueries - 2019/12/30 18:55:04.024830 [INFO] agent: started state syncer
2019/12/30 18:55:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:04 [INFO]  raft: Node at 127.0.0.1:18238 [Candidate] entering Candidate state in term 2
TestDNS_Compression_Query - 2019/12/30 18:55:04.497183 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 922.358µs) from client 127.0.0.1:46928 (udp)
TestDNS_Compression_Query - 2019/12/30 18:55:04.499246 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 910.024µs) from client 127.0.0.1:46928 (udp)
TestDNS_Compression_Query - 2019/12/30 18:55:04.501480 [DEBUG] dns: request for name d285a242-5d4d-58e8-9417-68adc78e3dd6.query.consul. type SRV class IN (took 1.170365ms) from client 127.0.0.1:40195 (udp)
TestDNS_Compression_Query - 2019/12/30 18:55:04.504606 [INFO] agent: Requesting shutdown
TestDNS_Compression_Query - 2019/12/30 18:55:04.504693 [INFO] consul: shutting down server
TestDNS_Compression_Query - 2019/12/30 18:55:04.504742 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Query - 2019/12/30 18:55:04.504637 [DEBUG] dns: request for name d285a242-5d4d-58e8-9417-68adc78e3dd6.query.consul. type SRV class IN (took 2.244059ms) from client 127.0.0.1:40195 (udp)
2019/12/30 18:55:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1e34a2e5-4bbf-2b9c-4502-c390494eba6b Address:127.0.0.1:18244}]
2019/12/30 18:55:04 [INFO]  raft: Node at 127.0.0.1:18244 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.614798 [INFO] serf: EventMemberJoin: Node 1e34a2e5-4bbf-2b9c-4502-c390494eba6b.dc1 127.0.0.1
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.619006 [INFO] serf: EventMemberJoin: Node 1e34a2e5-4bbf-2b9c-4502-c390494eba6b 127.0.0.1
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.619764 [INFO] consul: Adding LAN server Node 1e34a2e5-4bbf-2b9c-4502-c390494eba6b (Addr: tcp/127.0.0.1:18244) (DC: dc1)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.620441 [INFO] consul: Handled member-join event for server "Node 1e34a2e5-4bbf-2b9c-4502-c390494eba6b.dc1" in area "wan"
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.620557 [INFO] agent: Started DNS server 127.0.0.1:18239 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.621057 [INFO] agent: Started DNS server 127.0.0.1:18239 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.623502 [INFO] agent: Started HTTP server on 127.0.0.1:18240 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:04.623598 [INFO] agent: started state syncer
2019/12/30 18:55:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:04 [INFO]  raft: Node at 127.0.0.1:18244 [Candidate] entering Candidate state in term 2
TestDNS_Compression_Query - 2019/12/30 18:55:04.716140 [WARN] serf: Shutdown without a Leave
2019/12/30 18:55:04 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:04 [INFO]  raft: Node at 127.0.0.1:18238 [Leader] entering Leader state
TestDNS_InvalidQueries - 2019/12/30 18:55:04.725349 [INFO] consul: cluster leadership acquired
TestDNS_InvalidQueries - 2019/12/30 18:55:04.725834 [INFO] consul: New leader elected: Node eb2d2cc2-b5b2-cd98-0120-3fd2a95c904c
TestDNS_Compression_Query - 2019/12/30 18:55:04.823894 [INFO] manager: shutting down
TestDNS_Compression_Query - 2019/12/30 18:55:04.825376 [INFO] agent: consul server down
TestDNS_Compression_Query - 2019/12/30 18:55:04.825441 [INFO] agent: shutdown complete
TestDNS_Compression_Query - 2019/12/30 18:55:04.825501 [INFO] agent: Stopping DNS server 127.0.0.1:18221 (tcp)
TestDNS_Compression_Query - 2019/12/30 18:55:04.825648 [INFO] agent: Stopping DNS server 127.0.0.1:18221 (udp)
TestDNS_Compression_Query - 2019/12/30 18:55:04.825810 [INFO] agent: Stopping HTTP server 127.0.0.1:18222 (tcp)
TestDNS_Compression_Query - 2019/12/30 18:55:04.826019 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_Query - 2019/12/30 18:55:04.826105 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_Query (5.89s)
=== CONT  TestDNS_AltDomains_SOA
TestDNS_Compression_Query - 2019/12/30 18:55:04.829248 [ERR] connect: Apply failed raft is already shutdown
TestDNS_Compression_Query - 2019/12/30 18:55:04.829463 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AltDomains_SOA - 2019/12/30 18:55:05.006276 [DEBUG] tlsutil: Update with version 1
TestDNS_AltDomains_SOA - 2019/12/30 18:55:05.010984 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a3fce0e2-fc19-d305-0660-233fcb0206ba Address:127.0.0.1:18250}]
2019/12/30 18:55:05 [INFO]  raft: Node at 127.0.0.1:18250 [Follower] entering Follower state (Leader: "")
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.083212 [INFO] serf: EventMemberJoin: test-node.dc1 127.0.0.1
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.087278 [INFO] serf: EventMemberJoin: test-node 127.0.0.1
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.088642 [INFO] consul: Handled member-join event for server "test-node.dc1" in area "wan"
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.088905 [INFO] consul: Adding LAN server test-node (Addr: tcp/127.0.0.1:18250) (DC: dc1)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.089789 [INFO] agent: Started DNS server 127.0.0.1:18245 (udp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.090269 [INFO] agent: Started DNS server 127.0.0.1:18245 (tcp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.093573 [INFO] agent: Started HTTP server on 127.0.0.1:18246 (tcp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:05.093660 [INFO] agent: started state syncer
2019/12/30 18:55:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:05 [INFO]  raft: Node at 127.0.0.1:18250 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:05 [INFO]  raft: Node at 127.0.0.1:18244 [Leader] entering Leader state
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:05.655151 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:05.655651 [INFO] consul: New leader elected: Node 1e34a2e5-4bbf-2b9c-4502-c390494eba6b
TestDNS_InvalidQueries - 2019/12/30 18:55:05.894023 [INFO] agent: Synced node info
TestDNS_InvalidQueries - 2019/12/30 18:55:05.903958 [WARN] dns: QName invalid: 
TestDNS_InvalidQueries - 2019/12/30 18:55:05.904458 [DEBUG] dns: request for name consul. type SRV class IN (took 358.009µs) from client 127.0.0.1:36107 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:05.905186 [WARN] dns: QName invalid: node.
TestDNS_InvalidQueries - 2019/12/30 18:55:05.905525 [DEBUG] dns: request for name node.consul. type SRV class IN (took 309.008µs) from client 127.0.0.1:56620 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:05.906352 [WARN] dns: QName invalid: service.
TestDNS_InvalidQueries - 2019/12/30 18:55:05.906689 [DEBUG] dns: request for name service.consul. type SRV class IN (took 305.008µs) from client 127.0.0.1:37972 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:05.907319 [WARN] dns: QName invalid: query.
TestDNS_InvalidQueries - 2019/12/30 18:55:05.907630 [DEBUG] dns: request for name query.consul. type SRV class IN (took 277.34µs) from client 127.0.0.1:39069 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:05.908288 [WARN] dns: QName invalid: foo.node.dc1.extra.
TestDNS_InvalidQueries - 2019/12/30 18:55:05.908622 [DEBUG] dns: request for name foo.node.dc1.extra.consul. type SRV class IN (took 319.675µs) from client 127.0.0.1:35057 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:05.909313 [WARN] dns: QName invalid: foo.service.dc1.extra.
TestDNS_InvalidQueries - 2019/12/30 18:55:05.910698 [DEBUG] dns: request for name foo.service.dc1.extra.consul. type SRV class IN (took 1.339035ms) from client 127.0.0.1:55085 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:05.911380 [WARN] dns: QName invalid: foo.query.dc1.extra.
TestDNS_InvalidQueries - 2019/12/30 18:55:05.911725 [INFO] agent: Requesting shutdown
TestDNS_InvalidQueries - 2019/12/30 18:55:05.911819 [INFO] consul: shutting down server
TestDNS_InvalidQueries - 2019/12/30 18:55:05.911872 [WARN] serf: Shutdown without a Leave
TestDNS_InvalidQueries - 2019/12/30 18:55:05.911756 [DEBUG] dns: request for name foo.query.dc1.extra.consul. type SRV class IN (took 442.012µs) from client 127.0.0.1:42186 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:06.021494 [WARN] serf: Shutdown without a Leave
TestDNS_InvalidQueries - 2019/12/30 18:55:06.166265 [INFO] manager: shutting down
2019/12/30 18:55:06 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:06 [INFO]  raft: Node at 127.0.0.1:18250 [Leader] entering Leader state
TestDNS_InvalidQueries - 2019/12/30 18:55:06.167078 [INFO] agent: consul server down
TestDNS_InvalidQueries - 2019/12/30 18:55:06.167137 [INFO] agent: shutdown complete
TestDNS_InvalidQueries - 2019/12/30 18:55:06.167205 [INFO] agent: Stopping DNS server 127.0.0.1:18233 (tcp)
TestDNS_InvalidQueries - 2019/12/30 18:55:06.167373 [INFO] agent: Stopping DNS server 127.0.0.1:18233 (udp)
TestDNS_InvalidQueries - 2019/12/30 18:55:06.167545 [INFO] agent: Stopping HTTP server 127.0.0.1:18234 (tcp)
TestDNS_InvalidQueries - 2019/12/30 18:55:06.167853 [INFO] agent: Waiting for endpoints to shut down
TestDNS_InvalidQueries - 2019/12/30 18:55:06.167937 [INFO] agent: Endpoints down
--- PASS: TestDNS_InvalidQueries (3.69s)
=== CONT  TestDNS_AltDomains_Service
TestDNS_InvalidQueries - 2019/12/30 18:55:06.171758 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/30 18:55:06.172072 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/30 18:55:06.172146 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/30 18:55:06.172194 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/30 18:55:06.172236 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.173454 [INFO] consul: cluster leadership acquired
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.173854 [INFO] consul: New leader elected: test-node
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AltDomains_Service - 2019/12/30 18:55:06.224368 [WARN] agent: Node name "Node 74885cfc-c635-7217-46a2-d822f9987c0b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_AltDomains_Service - 2019/12/30 18:55:06.224838 [DEBUG] tlsutil: Update with version 1
TestDNS_AltDomains_Service - 2019/12/30 18:55:06.226895 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.335164 [INFO] agent: Synced node info
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.335277 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.342135 [WARN] consul: endpoint injected; this should only be used for testing
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.343525 [WARN] dns: Query results too stale, re-requesting
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.343961 [DEBUG] dns: request for name nope.query.consul. type SRV class IN (took 582.016µs) from client 127.0.0.1:48847 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.344234 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.344373 [INFO] consul: shutting down server
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.344584 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.466520 [WARN] serf: Shutdown without a Leave
2019/12/30 18:55:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2b499938-5d14-3fe6-cae3-20eb24c3ab0e Address:127.0.0.1:18256}]
2019/12/30 18:55:06 [INFO]  raft: Node at 127.0.0.1:18256 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:06.622328 [INFO] manager: shutting down
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.623882 [INFO] serf: EventMemberJoin: test-node.dc1 127.0.0.1
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.628601 [INFO] serf: EventMemberJoin: test-node 127.0.0.1
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.630128 [INFO] consul: Adding LAN server test-node (Addr: tcp/127.0.0.1:18256) (DC: dc1)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.630678 [INFO] consul: Handled member-join event for server "test-node.dc1" in area "wan"
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.632499 [INFO] agent: Started DNS server 127.0.0.1:18251 (tcp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.632579 [INFO] agent: Started DNS server 127.0.0.1:18251 (udp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.635449 [INFO] agent: Started HTTP server on 127.0.0.1:18252 (tcp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:06.635538 [INFO] agent: started state syncer
2019/12/30 18:55:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:06 [INFO]  raft: Node at 127.0.0.1:18256 [Candidate] entering Candidate state in term 2
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.845720 [INFO] agent: Synced node info
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.863692 [DEBUG] dns: request for name test-node.node.consul. type A class IN (took 609.35µs) from client 127.0.0.1:34453 (udp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.869638 [DEBUG] dns: request for name test-node.node.test.consul. type A class IN (took 707.018µs) from client 127.0.0.1:57072 (udp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.870982 [DEBUG] dns: request for name test-node.node.dc1.consul. type A class IN (took 594.682µs) from client 127.0.0.1:43318 (udp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.872085 [DEBUG] dns: request for name test-node.node.dc1.test.consul. type A class IN (took 486.68µs) from client 127.0.0.1:32976 (udp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.872184 [INFO] agent: Requesting shutdown
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.872253 [INFO] consul: shutting down server
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:06.872305 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.066689 [INFO] agent: consul server down
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.066820 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.066889 [INFO] agent: Stopping DNS server 127.0.0.1:18239 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.067069 [INFO] agent: Stopping DNS server 127.0.0.1:18239 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.067255 [INFO] agent: Stopping HTTP server 127.0.0.1:18240 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.067501 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.067576 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_AllowStale (3.72s)
=== CONT  TestDNS_NonExistingLookupEmptyAorAAAA
TestDNS_PreparedQuery_AllowStale - 2019/12/30 18:55:07.070064 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.071689 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:07.123496 [WARN] agent: Node name "Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:07.123887 [DEBUG] tlsutil: Update with version 1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:07.126340 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.266347 [INFO] manager: shutting down
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.527578 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.527824 [INFO] agent: consul server down
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.527872 [INFO] agent: shutdown complete
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.527923 [INFO] agent: Stopping DNS server 127.0.0.1:18245 (tcp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.528062 [INFO] agent: Stopping DNS server 127.0.0.1:18245 (udp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.528228 [INFO] agent: Stopping HTTP server 127.0.0.1:18246 (tcp)
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.528463 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AltDomains_Overlap - 2019/12/30 18:55:07.528502 [INFO] agent: Endpoints down
--- PASS: TestDNS_AltDomains_Overlap (3.67s)
=== CONT  TestDNS_NonExistingLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NonExistingLookup - 2019/12/30 18:55:07.589005 [WARN] agent: Node name "Node 997f95f3-0149-3c69-7558-b5fa0bb7c22f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NonExistingLookup - 2019/12/30 18:55:07.589621 [DEBUG] tlsutil: Update with version 1
TestDNS_NonExistingLookup - 2019/12/30 18:55:07.591830 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:07 [INFO]  raft: Node at 127.0.0.1:18256 [Leader] entering Leader state
TestDNS_AltDomains_SOA - 2019/12/30 18:55:07.802216 [INFO] consul: cluster leadership acquired
TestDNS_AltDomains_SOA - 2019/12/30 18:55:07.802618 [INFO] consul: New leader elected: test-node
2019/12/30 18:55:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:74885cfc-c635-7217-46a2-d822f9987c0b Address:127.0.0.1:18262}]
2019/12/30 18:55:08 [INFO]  raft: Node at 127.0.0.1:18262 [Follower] entering Follower state (Leader: "")
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.095588 [INFO] serf: EventMemberJoin: Node 74885cfc-c635-7217-46a2-d822f9987c0b.dc1 127.0.0.1
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.102320 [INFO] serf: EventMemberJoin: Node 74885cfc-c635-7217-46a2-d822f9987c0b 127.0.0.1
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.103458 [INFO] consul: Adding LAN server Node 74885cfc-c635-7217-46a2-d822f9987c0b (Addr: tcp/127.0.0.1:18262) (DC: dc1)
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.103810 [INFO] consul: Handled member-join event for server "Node 74885cfc-c635-7217-46a2-d822f9987c0b.dc1" in area "wan"
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.104694 [INFO] agent: Started DNS server 127.0.0.1:18257 (udp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.104802 [INFO] agent: Started DNS server 127.0.0.1:18257 (tcp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.107365 [INFO] agent: Started HTTP server on 127.0.0.1:18258 (tcp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:08.107462 [INFO] agent: started state syncer
2019/12/30 18:55:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:08 [INFO]  raft: Node at 127.0.0.1:18262 [Candidate] entering Candidate state in term 2
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.376216 [INFO] agent: Synced node info
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.376332 [DEBUG] agent: Node info in sync
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.386906 [WARN] dns: no servers found
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.387276 [DEBUG] dns: request for name test-node.node.consul. type SOA class IN (took 794.354µs) from client 127.0.0.1:55403 (udp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.388932 [WARN] dns: no servers found
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.389210 [DEBUG] dns: request for name test-node.node.test-domain. type SOA class IN (took 718.019µs) from client 127.0.0.1:60143 (udp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.389378 [INFO] agent: Requesting shutdown
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.389492 [INFO] consul: shutting down server
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.389531 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.533551 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_SOA - 2019/12/30 18:55:08.775514 [INFO] manager: shutting down
2019/12/30 18:55:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:526e6a9b-aed2-b7ef-3a10-eed6c9f46889 Address:127.0.0.1:18268}]
2019/12/30 18:55:08 [INFO]  raft: Node at 127.0.0.1:18268 [Follower] entering Follower state (Leader: "")
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.783159 [INFO] serf: EventMemberJoin: Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889.dc1 127.0.0.1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.788318 [INFO] serf: EventMemberJoin: Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889 127.0.0.1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.789890 [INFO] consul: Adding LAN server Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889 (Addr: tcp/127.0.0.1:18268) (DC: dc1)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.789999 [INFO] consul: Handled member-join event for server "Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889.dc1" in area "wan"
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.791431 [INFO] agent: Started DNS server 127.0.0.1:18263 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.794875 [INFO] agent: Started DNS server 127.0.0.1:18263 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.797530 [INFO] agent: Started HTTP server on 127.0.0.1:18264 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:08.797618 [INFO] agent: started state syncer
2019/12/30 18:55:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:08 [INFO]  raft: Node at 127.0.0.1:18268 [Candidate] entering Candidate state in term 2
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.050752 [INFO] agent: consul server down
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.050836 [INFO] agent: shutdown complete
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.050899 [INFO] agent: Stopping DNS server 127.0.0.1:18251 (tcp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.051036 [INFO] agent: Stopping DNS server 127.0.0.1:18251 (udp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.051200 [INFO] agent: Stopping HTTP server 127.0.0.1:18252 (tcp)
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.051403 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.051477 [INFO] agent: Endpoints down
--- PASS: TestDNS_AltDomains_SOA (4.23s)
=== CONT  TestDNS_AddressLookup
TestDNS_AltDomains_SOA - 2019/12/30 18:55:09.052207 [ERR] consul: failed to establish leadership: leadership lost while committing log
2019/12/30 18:55:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:997f95f3-0149-3c69-7558-b5fa0bb7c22f Address:127.0.0.1:18274}]
2019/12/30 18:55:09 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:09 [INFO]  raft: Node at 127.0.0.1:18262 [Leader] entering Leader state
2019/12/30 18:55:09 [INFO]  raft: Node at 127.0.0.1:18274 [Follower] entering Follower state (Leader: "")
TestDNS_AltDomains_Service - 2019/12/30 18:55:09.173303 [INFO] consul: cluster leadership acquired
TestDNS_AltDomains_Service - 2019/12/30 18:55:09.173745 [INFO] consul: New leader elected: Node 74885cfc-c635-7217-46a2-d822f9987c0b
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.181717 [INFO] serf: EventMemberJoin: Node 997f95f3-0149-3c69-7558-b5fa0bb7c22f.dc1 127.0.0.1
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.187888 [INFO] serf: EventMemberJoin: Node 997f95f3-0149-3c69-7558-b5fa0bb7c22f 127.0.0.1
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.190503 [INFO] agent: Started DNS server 127.0.0.1:18269 (udp)
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.193364 [INFO] agent: Started DNS server 127.0.0.1:18269 (tcp)
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.191314 [INFO] consul: Adding LAN server Node 997f95f3-0149-3c69-7558-b5fa0bb7c22f (Addr: tcp/127.0.0.1:18274) (DC: dc1)
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.191335 [INFO] consul: Handled member-join event for server "Node 997f95f3-0149-3c69-7558-b5fa0bb7c22f.dc1" in area "wan"
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.203183 [INFO] agent: Started HTTP server on 127.0.0.1:18270 (tcp)
TestDNS_NonExistingLookup - 2019/12/30 18:55:09.203445 [INFO] agent: started state syncer
2019/12/30 18:55:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
WARNING: bootstrap = true: do not enable unless necessary
2019/12/30 18:55:09 [INFO]  raft: Node at 127.0.0.1:18274 [Candidate] entering Candidate state in term 2
TestDNS_AddressLookup - 2019/12/30 18:55:09.219570 [WARN] agent: Node name "Node d429eed6-5076-f270-1dae-a35451e8da88" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_AddressLookup - 2019/12/30 18:55:09.221268 [DEBUG] tlsutil: Update with version 1
TestDNS_AddressLookup - 2019/12/30 18:55:09.226798 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:09 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:09 [INFO]  raft: Node at 127.0.0.1:18268 [Leader] entering Leader state
TestDNS_AltDomains_Service - 2019/12/30 18:55:09.828256 [INFO] agent: Synced node info
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:09.828757 [INFO] consul: cluster leadership acquired
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:09.829793 [INFO] consul: New leader elected: Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889
2019/12/30 18:55:10 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:10 [INFO]  raft: Node at 127.0.0.1:18274 [Leader] entering Leader state
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.266746 [INFO] consul: cluster leadership acquired
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.267247 [INFO] consul: New leader elected: Node 997f95f3-0149-3c69-7558-b5fa0bb7c22f
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:10.383956 [INFO] agent: Synced node info
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.396973 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 5.968158ms) from client 127.0.0.1:55586 (udp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.397183 [DEBUG] dns: request for name db.service.test-domain. type SRV class IN (took 872.357µs) from client 127.0.0.1:34411 (udp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.399181 [DEBUG] dns: request for name db.service.dc1.consul. type SRV class IN (took 1.02736ms) from client 127.0.0.1:43085 (udp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.401154 [DEBUG] dns: request for name db.service.dc1.test-domain. type SRV class IN (took 813.688µs) from client 127.0.0.1:40709 (udp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.402102 [INFO] agent: Requesting shutdown
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.402365 [INFO] consul: shutting down server
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.402651 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.528922 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.668388 [INFO] manager: shutting down
2019/12/30 18:55:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d429eed6-5076-f270-1dae-a35451e8da88 Address:127.0.0.1:18280}]
2019/12/30 18:55:10 [INFO]  raft: Node at 127.0.0.1:18280 [Follower] entering Follower state (Leader: "")
TestDNS_AddressLookup - 2019/12/30 18:55:10.805216 [INFO] serf: EventMemberJoin: Node d429eed6-5076-f270-1dae-a35451e8da88.dc1 127.0.0.1
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.805225 [INFO] agent: Synced node info
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.807806 [INFO] agent: consul server down
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.807885 [INFO] agent: shutdown complete
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.807955 [INFO] agent: Stopping DNS server 127.0.0.1:18257 (tcp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.808116 [INFO] agent: Stopping DNS server 127.0.0.1:18257 (udp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.808296 [INFO] agent: Stopping HTTP server 127.0.0.1:18258 (tcp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.808534 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.808621 [INFO] agent: Endpoints down
--- PASS: TestDNS_AltDomains_Service (4.64s)
=== CONT  TestDNS_ServiceLookup_FilterACL
--- PASS: TestDNS_ServiceLookup_FilterACL (0.00s)
=== CONT  TestDNS_ServiceLookup_SRV_RFC_TCP_Default
TestDNS_AddressLookup - 2019/12/30 18:55:10.809226 [INFO] serf: EventMemberJoin: Node d429eed6-5076-f270-1dae-a35451e8da88 127.0.0.1
TestDNS_AddressLookup - 2019/12/30 18:55:10.811315 [INFO] agent: Started DNS server 127.0.0.1:18275 (udp)
TestDNS_AddressLookup - 2019/12/30 18:55:10.811921 [INFO] consul: Adding LAN server Node d429eed6-5076-f270-1dae-a35451e8da88 (Addr: tcp/127.0.0.1:18280) (DC: dc1)
TestDNS_AddressLookup - 2019/12/30 18:55:10.812199 [INFO] consul: Handled member-join event for server "Node d429eed6-5076-f270-1dae-a35451e8da88.dc1" in area "wan"
TestDNS_AddressLookup - 2019/12/30 18:55:10.815625 [INFO] agent: Started DNS server 127.0.0.1:18275 (tcp)
TestDNS_AddressLookup - 2019/12/30 18:55:10.818120 [INFO] agent: Started HTTP server on 127.0.0.1:18276 (tcp)
TestDNS_AddressLookup - 2019/12/30 18:55:10.818221 [INFO] agent: started state syncer
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.819142 [WARN] dns: QName invalid: nonexisting.
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.819636 [DEBUG] dns: request for name nonexisting.consul. type ANY class IN (took 451.678µs) from client 127.0.0.1:54929 (udp)
TestDNS_AltDomains_Service - 2019/12/30 18:55:10.820621 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.820728 [INFO] agent: Requesting shutdown
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.820794 [INFO] consul: shutting down server
TestDNS_NonExistingLookup - 2019/12/30 18:55:10.820846 [WARN] serf: Shutdown without a Leave
2019/12/30 18:55:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:10 [INFO]  raft: Node at 127.0.0.1:18280 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:10.872079 [WARN] agent: Node name "Node 1b0fa7c2-2bd0-cabd-6b40-876f4a7d73b3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:10.872507 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:10.874673 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.032958 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.116357 [INFO] manager: shutting down
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.124171 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.126147 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.128379 [INFO] agent: consul server down
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.128695 [INFO] agent: shutdown complete
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.129099 [INFO] agent: Stopping DNS server 127.0.0.1:18269 (tcp)
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.129614 [INFO] agent: Stopping DNS server 127.0.0.1:18269 (udp)
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.129790 [INFO] agent: Stopping HTTP server 127.0.0.1:18270 (tcp)
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.130038 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NonExistingLookup - 2019/12/30 18:55:11.130125 [INFO] agent: Endpoints down
--- PASS: TestDNS_NonExistingLookup (3.60s)
=== CONT  TestDNS_ServiceLookup_SRV_RFC
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:11.191813 [WARN] agent: Node name "Node 7658a0bb-efbf-a8a5-a0ab-0685ff485276" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:11.192234 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:11.194808 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:11 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:11 [INFO]  raft: Node at 127.0.0.1:18280 [Leader] entering Leader state
TestDNS_AddressLookup - 2019/12/30 18:55:11.620374 [INFO] consul: cluster leadership acquired
TestDNS_AddressLookup - 2019/12/30 18:55:11.621006 [INFO] consul: New leader elected: Node d429eed6-5076-f270-1dae-a35451e8da88
TestDNS_AddressLookup - 2019/12/30 18:55:12.234002 [INFO] agent: Synced node info
2019/12/30 18:55:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1b0fa7c2-2bd0-cabd-6b40-876f4a7d73b3 Address:127.0.0.1:18286}]
TestDNS_AddressLookup - 2019/12/30 18:55:12.241174 [DEBUG] dns: request for name 7f000001.addr.dc1.consul. type SRV class IN (took 440.345µs) from client 127.0.0.1:38906 (udp)
TestDNS_AddressLookup - 2019/12/30 18:55:12.241627 [INFO] agent: Requesting shutdown
TestDNS_AddressLookup - 2019/12/30 18:55:12.241714 [INFO] consul: shutting down server
TestDNS_AddressLookup - 2019/12/30 18:55:12.241767 [WARN] serf: Shutdown without a Leave
2019/12/30 18:55:12 [INFO]  raft: Node at 127.0.0.1:18286 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.250940 [INFO] serf: EventMemberJoin: Node 1b0fa7c2-2bd0-cabd-6b40-876f4a7d73b3.dc1 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.257456 [INFO] serf: EventMemberJoin: Node 1b0fa7c2-2bd0-cabd-6b40-876f4a7d73b3 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.258368 [INFO] consul: Adding LAN server Node 1b0fa7c2-2bd0-cabd-6b40-876f4a7d73b3 (Addr: tcp/127.0.0.1:18286) (DC: dc1)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.259065 [INFO] consul: Handled member-join event for server "Node 1b0fa7c2-2bd0-cabd-6b40-876f4a7d73b3.dc1" in area "wan"
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.262482 [INFO] agent: Started DNS server 127.0.0.1:18281 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.263311 [INFO] agent: Started DNS server 127.0.0.1:18281 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.267345 [INFO] agent: Started HTTP server on 127.0.0.1:18282 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:12.288151 [INFO] agent: started state syncer
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:12.296255 [DEBUG] agent: Node info in sync
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:12.296360 [DEBUG] agent: Node info in sync
2019/12/30 18:55:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:12 [INFO]  raft: Node at 127.0.0.1:18286 [Candidate] entering Candidate state in term 2
TestDNS_AddressLookup - 2019/12/30 18:55:12.396875 [WARN] serf: Shutdown without a Leave
TestDNS_AddressLookup - 2019/12/30 18:55:12.591757 [INFO] manager: shutting down
TestDNS_AddressLookup - 2019/12/30 18:55:12.595310 [INFO] agent: consul server down
TestDNS_AddressLookup - 2019/12/30 18:55:12.595405 [INFO] agent: shutdown complete
TestDNS_AddressLookup - 2019/12/30 18:55:12.595471 [INFO] agent: Stopping DNS server 127.0.0.1:18275 (tcp)
TestDNS_AddressLookup - 2019/12/30 18:55:12.595686 [INFO] agent: Stopping DNS server 127.0.0.1:18275 (udp)
TestDNS_AddressLookup - 2019/12/30 18:55:12.595791 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_AddressLookup - 2019/12/30 18:55:12.595883 [INFO] agent: Stopping HTTP server 127.0.0.1:18276 (tcp)
TestDNS_AddressLookup - 2019/12/30 18:55:12.596073 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_AddressLookup - 2019/12/30 18:55:12.596106 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AddressLookup - 2019/12/30 18:55:12.596196 [INFO] agent: Endpoints down
TestDNS_AddressLookup - 2019/12/30 18:55:12.596267 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
--- PASS: TestDNS_AddressLookup (3.54s)
TestDNS_AddressLookup - 2019/12/30 18:55:12.596326 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
=== CONT  TestDNS_PreparedQuery_TTL
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:12.667540 [WARN] agent: Node name "Node 1923c24d-bc31-8fea-babf-0257fe599d50" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:12.667971 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:12.670179 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7658a0bb-efbf-a8a5-a0ab-0685ff485276 Address:127.0.0.1:18292}]
2019/12/30 18:55:12 [INFO]  raft: Node at 127.0.0.1:18292 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.756431 [INFO] serf: EventMemberJoin: Node 7658a0bb-efbf-a8a5-a0ab-0685ff485276.dc1 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.762159 [INFO] serf: EventMemberJoin: Node 7658a0bb-efbf-a8a5-a0ab-0685ff485276 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.763935 [INFO] consul: Adding LAN server Node 7658a0bb-efbf-a8a5-a0ab-0685ff485276 (Addr: tcp/127.0.0.1:18292) (DC: dc1)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.764925 [INFO] consul: Handled member-join event for server "Node 7658a0bb-efbf-a8a5-a0ab-0685ff485276.dc1" in area "wan"
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.768427 [INFO] agent: Started DNS server 127.0.0.1:18287 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.768611 [INFO] agent: Started DNS server 127.0.0.1:18287 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.772346 [INFO] agent: Started HTTP server on 127.0.0.1:18288 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:12.772468 [INFO] agent: started state syncer
2019/12/30 18:55:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:12 [INFO]  raft: Node at 127.0.0.1:18292 [Candidate] entering Candidate state in term 2
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:12.895517 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:13 [INFO]  raft: Node at 127.0.0.1:18286 [Leader] entering Leader state
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.136115 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.136822 [DEBUG] consul: Skipping self join check for "Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889" since the cluster is too small
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.137121 [INFO] consul: member 'Node 526e6a9b-aed2-b7ef-3a10-eed6c9f46889' joined, marking health alive
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.140530 [DEBUG] dns: request for name webv4.service.consul. type AAAA class IN (took 1.164031ms) from client 127.0.0.1:44088 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:13.143886 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:13.144624 [INFO] consul: New leader elected: Node 1b0fa7c2-2bd0-cabd-6b40-876f4a7d73b3
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.147378 [DEBUG] dns: request for name webv4.query.consul. type AAAA class IN (took 1.01636ms) from client 127.0.0.1:54618 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.149322 [DEBUG] dns: request for name webv6.service.consul. type A class IN (took 921.358µs) from client 127.0.0.1:33538 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.151540 [DEBUG] dns: request for name webv6.query.consul. type A class IN (took 1.00336ms) from client 127.0.0.1:54473 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.153489 [INFO] agent: Requesting shutdown
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.153837 [INFO] consul: shutting down server
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.154011 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.263485 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.396934 [INFO] manager: shutting down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.400812 [INFO] agent: consul server down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.400887 [INFO] agent: shutdown complete
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.400942 [INFO] agent: Stopping DNS server 127.0.0.1:18263 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.401133 [INFO] agent: Stopping DNS server 127.0.0.1:18263 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.401312 [INFO] agent: Stopping HTTP server 127.0.0.1:18264 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.401581 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/30 18:55:13.401666 [INFO] agent: Endpoints down
--- PASS: TestDNS_NonExistingLookupEmptyAorAAAA (6.33s)
=== CONT  TestDNS_ServiceLookup_TTL
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:13.481949 [WARN] agent: Node name "Node e22762e3-c5bb-0dfe-1ed5-6f225555ccc4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:13.482367 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:13.494923 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:13 [INFO]  raft: Node at 127.0.0.1:18292 [Leader] entering Leader state
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:13.635123 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:13.635658 [INFO] consul: New leader elected: Node 7658a0bb-efbf-a8a5-a0ab-0685ff485276
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:13.640274 [INFO] agent: Synced node info
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:13.881948 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:13.882063 [DEBUG] agent: Node info in sync
2019/12/30 18:55:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1923c24d-bc31-8fea-babf-0257fe599d50 Address:127.0.0.1:18298}]
2019/12/30 18:55:13 [INFO]  raft: Node at 127.0.0.1:18298 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.930852 [INFO] serf: EventMemberJoin: Node 1923c24d-bc31-8fea-babf-0257fe599d50.dc1 127.0.0.1
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.934020 [INFO] serf: EventMemberJoin: Node 1923c24d-bc31-8fea-babf-0257fe599d50 127.0.0.1
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.934957 [INFO] consul: Adding LAN server Node 1923c24d-bc31-8fea-babf-0257fe599d50 (Addr: tcp/127.0.0.1:18298) (DC: dc1)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.935283 [INFO] consul: Handled member-join event for server "Node 1923c24d-bc31-8fea-babf-0257fe599d50.dc1" in area "wan"
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.939989 [INFO] agent: Started DNS server 127.0.0.1:18293 (udp)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.940280 [INFO] agent: Started DNS server 127.0.0.1:18293 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.946658 [INFO] agent: Started HTTP server on 127.0.0.1:18294 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:13.946815 [INFO] agent: started state syncer
2019/12/30 18:55:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:13 [INFO]  raft: Node at 127.0.0.1:18298 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.066997 [DEBUG] dns: request for name _db._tcp.service.dc1.consul. type SRV class IN (took 916.358µs) from client 127.0.0.1:57657 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.069768 [DEBUG] dns: request for name _db._tcp.service.consul. type SRV class IN (took 789.021µs) from client 127.0.0.1:46780 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.071622 [DEBUG] dns: request for name _db._tcp.dc1.consul. type SRV class IN (took 779.02µs) from client 127.0.0.1:40653 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.073612 [DEBUG] dns: request for name _db._tcp.consul. type SRV class IN (took 822.356µs) from client 127.0.0.1:45425 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.073695 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.073828 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.073963 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.153129 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.266452 [INFO] manager: shutting down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.402418 [INFO] agent: consul server down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.402500 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.402567 [INFO] agent: Stopping DNS server 127.0.0.1:18281 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.402765 [INFO] agent: Stopping DNS server 127.0.0.1:18281 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.402966 [INFO] agent: Stopping HTTP server 127.0.0.1:18282 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.403199 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.403276 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SRV_RFC_TCP_Default (3.59s)
=== CONT  TestDNS_NodeLookup_TTL
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/30 18:55:14.404232 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:14.514642 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:14.594679 [WARN] agent: Node name "Node bac700ea-11c1-ef19-870d-c4da1e593b4c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:14.595360 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:14.599118 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e22762e3-c5bb-0dfe-1ed5-6f225555ccc4 Address:127.0.0.1:18304}]
2019/12/30 18:55:14 [INFO]  raft: Node at 127.0.0.1:18304 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.625885 [INFO] serf: EventMemberJoin: Node e22762e3-c5bb-0dfe-1ed5-6f225555ccc4.dc1 127.0.0.1
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.629101 [INFO] serf: EventMemberJoin: Node e22762e3-c5bb-0dfe-1ed5-6f225555ccc4 127.0.0.1
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.629865 [INFO] consul: Adding LAN server Node e22762e3-c5bb-0dfe-1ed5-6f225555ccc4 (Addr: tcp/127.0.0.1:18304) (DC: dc1)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.630497 [INFO] consul: Handled member-join event for server "Node e22762e3-c5bb-0dfe-1ed5-6f225555ccc4.dc1" in area "wan"
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.631660 [INFO] agent: Started DNS server 127.0.0.1:18299 (tcp)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.631978 [INFO] agent: Started DNS server 127.0.0.1:18299 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.634620 [INFO] agent: Started HTTP server on 127.0.0.1:18300 (tcp)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:14.634722 [INFO] agent: started state syncer
2019/12/30 18:55:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:14 [INFO]  raft: Node at 127.0.0.1:18304 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:14 [INFO]  raft: Node at 127.0.0.1:18298 [Leader] entering Leader state
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:14.814299 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:14.815387 [INFO] consul: New leader elected: Node 1923c24d-bc31-8fea-babf-0257fe599d50
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.311551 [DEBUG] dns: request for name _db._master.service.dc1.consul. type SRV class IN (took 670.685µs) from client 127.0.0.1:56067 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.314710 [DEBUG] dns: request for name _db._master.service.consul. type SRV class IN (took 741.353µs) from client 127.0.0.1:38869 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.316085 [DEBUG] dns: request for name _db._master.dc1.consul. type SRV class IN (took 573.348µs) from client 127.0.0.1:38847 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.326190 [DEBUG] dns: request for name _db._master.consul. type SRV class IN (took 676.018µs) from client 127.0.0.1:54382 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.326281 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.326350 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.326399 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.399781 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:15.400915 [INFO] agent: Synced node info
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:15.401042 [DEBUG] agent: Node info in sync
2019/12/30 18:55:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:15 [INFO]  raft: Node at 127.0.0.1:18304 [Leader] entering Leader state
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:15.493647 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:15.494058 [INFO] consul: New leader elected: Node e22762e3-c5bb-0dfe-1ed5-6f225555ccc4
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.495881 [INFO] manager: shutting down
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.496361 [INFO] agent: consul server down
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.496419 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.496472 [INFO] agent: Stopping DNS server 127.0.0.1:18287 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.496611 [INFO] agent: Stopping DNS server 127.0.0.1:18287 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.496754 [INFO] agent: Stopping HTTP server 127.0.0.1:18288 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.496936 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.497001 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SRV_RFC (4.37s)
=== CONT  TestDNS_ServiceLookup_AnswerLimits
--- PASS: TestDNS_ServiceLookup_AnswerLimits (0.00s)
=== CONT  TestDNS_ServiceLookup_LargeResponses
TestDNS_ServiceLookup_SRV_RFC - 2019/12/30 18:55:15.517005 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
2019/12/30 18:55:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bac700ea-11c1-ef19-870d-c4da1e593b4c Address:127.0.0.1:18310}]
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.572162 [INFO] serf: EventMemberJoin: Node bac700ea-11c1-ef19-870d-c4da1e593b4c.dc1 127.0.0.1
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.575679 [INFO] serf: EventMemberJoin: Node bac700ea-11c1-ef19-870d-c4da1e593b4c 127.0.0.1
2019/12/30 18:55:15 [INFO]  raft: Node at 127.0.0.1:18310 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.587357 [INFO] consul: Adding LAN server Node bac700ea-11c1-ef19-870d-c4da1e593b4c (Addr: tcp/127.0.0.1:18310) (DC: dc1)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.588803 [INFO] consul: Handled member-join event for server "Node bac700ea-11c1-ef19-870d-c4da1e593b4c.dc1" in area "wan"
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.591625 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.592279 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.593068 [INFO] agent: Started DNS server 127.0.0.1:18305 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.594561 [INFO] agent: Started DNS server 127.0.0.1:18305 (udp)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.602208 [INFO] agent: Started HTTP server on 127.0.0.1:18306 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:15.602337 [INFO] agent: started state syncer
2019/12/30 18:55:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:15 [INFO]  raft: Node at 127.0.0.1:18310 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:15.664556 [WARN] agent: Node name "Node d64773c1-0905-7311-3f0b-8a2470d14273" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:15.665986 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:15.670853 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:15.819688 [INFO] agent: Synced node info
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:15.819806 [DEBUG] agent: Node info in sync
2019/12/30 18:55:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:16 [INFO]  raft: Node at 127.0.0.1:18310 [Leader] entering Leader state
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:16.304471 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:16.304886 [INFO] consul: New leader elected: Node bac700ea-11c1-ef19-870d-c4da1e593b4c
2019/12/30 18:55:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d64773c1-0905-7311-3f0b-8a2470d14273 Address:127.0.0.1:18316}]
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:16.733941 [INFO] agent: Synced node info
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.737311 [INFO] serf: EventMemberJoin: Node d64773c1-0905-7311-3f0b-8a2470d14273.dc1 127.0.0.1
2019/12/30 18:55:16 [INFO]  raft: Node at 127.0.0.1:18316 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.747767 [INFO] serf: EventMemberJoin: Node d64773c1-0905-7311-3f0b-8a2470d14273 127.0.0.1
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.749449 [INFO] consul: Adding LAN server Node d64773c1-0905-7311-3f0b-8a2470d14273 (Addr: tcp/127.0.0.1:18316) (DC: dc1)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.750263 [INFO] consul: Handled member-join event for server "Node d64773c1-0905-7311-3f0b-8a2470d14273.dc1" in area "wan"
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.752202 [INFO] agent: Started DNS server 127.0.0.1:18311 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.752887 [INFO] agent: Started DNS server 127.0.0.1:18311 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.756622 [INFO] agent: Started HTTP server on 127.0.0.1:18312 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:16.757129 [INFO] agent: started state syncer
2019/12/30 18:55:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:16 [INFO]  raft: Node at 127.0.0.1:18316 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:16.984184 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:16.993384 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:16.997952 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:17.141582 [DEBUG] consul: Skipping self join check for "Node 1923c24d-bc31-8fea-babf-0257fe599d50" since the cluster is too small
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:17.141996 [INFO] consul: member 'Node 1923c24d-bc31-8fea-babf-0257fe599d50' joined, marking health alive
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.148350 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.161917 [INFO] consul: shutting down server
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.162190 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.330250 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:17.332166 [DEBUG] dns: request for name foo.node.consul. type ANY class IN (took 527.68µs) from client 127.0.0.1:36647 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.408189 [INFO] manager: shutting down
2019/12/30 18:55:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:17 [INFO]  raft: Node at 127.0.0.1:18316 [Leader] entering Leader state
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:17.413777 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:17.414216 [INFO] consul: New leader elected: Node d64773c1-0905-7311-3f0b-8a2470d14273
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.414952 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.415200 [INFO] agent: consul server down
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.415249 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.415301 [INFO] agent: Stopping DNS server 127.0.0.1:18299 (tcp)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.415446 [INFO] agent: Stopping DNS server 127.0.0.1:18299 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.415607 [INFO] agent: Stopping HTTP server 127.0.0.1:18300 (tcp)
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.415838 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_TTL - 2019/12/30 18:55:17.415906 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_TTL (4.01s)
=== CONT  TestDNS_ServiceLookup_Truncate
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:17.510266 [WARN] agent: Node name "Node c3319e0f-3a56-c966-773d-9c9467f40729" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:17.510744 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:17.512922 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:17.850575 [INFO] agent: Synced node info
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:17.854330 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 529.681µs) from client 127.0.0.1:34522 (udp)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:17.855040 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.151392 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.151859 [DEBUG] consul: Skipping self join check for "Node bac700ea-11c1-ef19-870d-c4da1e593b4c" since the cluster is too small
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.152002 [INFO] consul: member 'Node bac700ea-11c1-ef19-870d-c4da1e593b4c' joined, marking health alive
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.155841 [DEBUG] dns: cname recurse RTT for www.google.com. (859.022µs)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.156139 [DEBUG] dns: request for name google.node.consul. type ANY class IN (took 1.970719ms) from client 127.0.0.1:55166 (udp)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.156349 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.156411 [INFO] consul: shutting down server
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.156455 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.234911 [WARN] serf: Shutdown without a Leave
2019/12/30 18:55:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c3319e0f-3a56-c966-773d-9c9467f40729 Address:127.0.0.1:18322}]
2019/12/30 18:55:18 [INFO]  raft: Node at 127.0.0.1:18322 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.310385 [INFO] manager: shutting down
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.310920 [INFO] agent: consul server down
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.311024 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.311129 [INFO] agent: Stopping DNS server 127.0.0.1:18305 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.311346 [INFO] agent: Stopping DNS server 127.0.0.1:18305 (udp)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.311547 [INFO] agent: Stopping HTTP server 127.0.0.1:18306 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.311805 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TTL - 2019/12/30 18:55:18.311893 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TTL (3.91s)
=== CONT  TestBinarySearch
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.313264 [INFO] serf: EventMemberJoin: Node c3319e0f-3a56-c966-773d-9c9467f40729.dc1 127.0.0.1
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.316422 [INFO] serf: EventMemberJoin: Node c3319e0f-3a56-c966-773d-9c9467f40729 127.0.0.1
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.317048 [INFO] consul: Adding LAN server Node c3319e0f-3a56-c966-773d-9c9467f40729 (Addr: tcp/127.0.0.1:18322) (DC: dc1)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.317112 [INFO] consul: Handled member-join event for server "Node c3319e0f-3a56-c966-773d-9c9467f40729.dc1" in area "wan"
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.317610 [INFO] agent: Started DNS server 127.0.0.1:18317 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.317692 [INFO] agent: Started DNS server 127.0.0.1:18317 (udp)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.320117 [INFO] agent: Started HTTP server on 127.0.0.1:18318 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.320249 [INFO] agent: started state syncer
2019/12/30 18:55:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:18 [INFO]  raft: Node at 127.0.0.1:18322 [Candidate] entering Candidate state in term 2
=== CONT  TestDNS_ServiceLookup_Randomize
--- PASS: TestBinarySearch (0.11s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:18.501657 [WARN] agent: Node name "Node 6762e2ab-3aab-04a3-819d-fd58d2332bda" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:18.502238 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:18.504566 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.776834 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.776947 [INFO] consul: shutting down server
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.776998 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.842525 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.925635 [INFO] manager: shutting down
2019/12/30 18:55:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:18 [INFO]  raft: Node at 127.0.0.1:18322 [Leader] entering Leader state
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.926404 [INFO] agent: consul server down
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.926457 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.926510 [INFO] agent: Stopping DNS server 127.0.0.1:18293 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.926677 [INFO] agent: Stopping DNS server 127.0.0.1:18293 (udp)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.926902 [INFO] agent: Stopping HTTP server 127.0.0.1:18294 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.927118 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_TTL - 2019/12/30 18:55:18.927190 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_TTL (6.33s)
=== CONT  TestDNS_ServiceLookup_OnlyPassing
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.928592 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:18.929049 [INFO] consul: New leader elected: Node c3319e0f-3a56-c966-773d-9c9467f40729
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:18.988540 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:18.988691 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:19.028553 [WARN] agent: Node name "Node 9c3deaad-b520-d1db-2b27-504d16b42aa9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:19.029266 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:19.031861 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6762e2ab-3aab-04a3-819d-fd58d2332bda Address:127.0.0.1:18328}]
2019/12/30 18:55:19 [INFO]  raft: Node at 127.0.0.1:18328 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.389026 [INFO] serf: EventMemberJoin: Node 6762e2ab-3aab-04a3-819d-fd58d2332bda.dc1 127.0.0.1
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.392347 [INFO] serf: EventMemberJoin: Node 6762e2ab-3aab-04a3-819d-fd58d2332bda 127.0.0.1
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.392900 [INFO] consul: Handled member-join event for server "Node 6762e2ab-3aab-04a3-819d-fd58d2332bda.dc1" in area "wan"
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.393190 [INFO] consul: Adding LAN server Node 6762e2ab-3aab-04a3-819d-fd58d2332bda (Addr: tcp/127.0.0.1:18328) (DC: dc1)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.393443 [INFO] agent: Started DNS server 127.0.0.1:18323 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.404647 [INFO] agent: Started DNS server 127.0.0.1:18323 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.407137 [INFO] agent: Started HTTP server on 127.0.0.1:18324 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:19.407243 [INFO] agent: started state syncer
2019/12/30 18:55:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:19 [INFO]  raft: Node at 127.0.0.1:18328 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:55:19.513723 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:55:19.513814 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:19.820743 [INFO] agent: Synced node info
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:19.820878 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.158760 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.160586 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.161170 [DEBUG] consul: Skipping self join check for "Node d64773c1-0905-7311-3f0b-8a2470d14273" since the cluster is too small
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.161363 [INFO] consul: member 'Node d64773c1-0905-7311-3f0b-8a2470d14273' joined, marking health alive
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.167908 [DEBUG] dns: request for name _this-is-a-very-very-very-very-very-long-name-for-a-service._master.service.consul. type SRV class IN (took 1.170698ms) from client 127.0.0.1:54302 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.175841 [DEBUG] dns: request for name this-is-a-very-very-very-very-very-long-name-for-a-service.query.consul. type SRV class IN (took 1.894717ms) from client 127.0.0.1:47013 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.176058 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.176241 [INFO] consul: shutting down server
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.176301 [WARN] serf: Shutdown without a Leave
2019/12/30 18:55:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:20 [INFO]  raft: Node at 127.0.0.1:18328 [Leader] entering Leader state
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.361782 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:20.363380 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:20.363843 [INFO] consul: New leader elected: Node 6762e2ab-3aab-04a3-819d-fd58d2332bda
2019/12/30 18:55:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9c3deaad-b520-d1db-2b27-504d16b42aa9 Address:127.0.0.1:18334}]
2019/12/30 18:55:20 [INFO]  raft: Node at 127.0.0.1:18334 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.454293 [INFO] serf: EventMemberJoin: Node 9c3deaad-b520-d1db-2b27-504d16b42aa9.dc1 127.0.0.1
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.457971 [INFO] serf: EventMemberJoin: Node 9c3deaad-b520-d1db-2b27-504d16b42aa9 127.0.0.1
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.459528 [INFO] agent: Started DNS server 127.0.0.1:18329 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.460462 [INFO] consul: Handled member-join event for server "Node 9c3deaad-b520-d1db-2b27-504d16b42aa9.dc1" in area "wan"
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.461010 [INFO] agent: Started DNS server 127.0.0.1:18329 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.461631 [INFO] consul: Adding LAN server Node 9c3deaad-b520-d1db-2b27-504d16b42aa9 (Addr: tcp/127.0.0.1:18334) (DC: dc1)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.463325 [INFO] agent: Started HTTP server on 127.0.0.1:18330 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:20.463419 [INFO] agent: started state syncer
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.463654 [INFO] manager: shutting down
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.464361 [INFO] agent: consul server down
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.464471 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.464529 [INFO] agent: Stopping DNS server 127.0.0.1:18311 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.464677 [INFO] agent: Stopping DNS server 127.0.0.1:18311 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.464831 [INFO] agent: Stopping HTTP server 127.0.0.1:18312 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.465050 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_LargeResponses - 2019/12/30 18:55:20.465120 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_LargeResponses (4.97s)
=== CONT  TestDNS_ServiceLookup_OnlyFailing
2019/12/30 18:55:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:20 [INFO]  raft: Node at 127.0.0.1:18334 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:20.533347 [WARN] agent: Node name "Node f4b7d691-1431-932c-b92d-23752b33f979" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:20.533922 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:20.536887 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:20.725888 [INFO] agent: Synced node info
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:20.726027 [DEBUG] agent: Node info in sync
2019/12/30 18:55:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:21 [INFO]  raft: Node at 127.0.0.1:18334 [Leader] entering Leader state
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:21.070621 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:21.071095 [INFO] consul: New leader elected: Node 9c3deaad-b520-d1db-2b27-504d16b42aa9
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:21.843978 [INFO] agent: Synced node info
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:21.844184 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:22.026521 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f4b7d691-1431-932c-b92d-23752b33f979 Address:127.0.0.1:18340}]
2019/12/30 18:55:22 [INFO]  raft: Node at 127.0.0.1:18340 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.194630 [INFO] serf: EventMemberJoin: Node f4b7d691-1431-932c-b92d-23752b33f979.dc1 127.0.0.1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.198095 [INFO] serf: EventMemberJoin: Node f4b7d691-1431-932c-b92d-23752b33f979 127.0.0.1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.200803 [INFO] consul: Adding LAN server Node f4b7d691-1431-932c-b92d-23752b33f979 (Addr: tcp/127.0.0.1:18340) (DC: dc1)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.201367 [INFO] agent: Started DNS server 127.0.0.1:18335 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.201662 [INFO] consul: Handled member-join event for server "Node f4b7d691-1431-932c-b92d-23752b33f979.dc1" in area "wan"
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.204148 [INFO] agent: Started DNS server 127.0.0.1:18335 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.207061 [INFO] agent: Started HTTP server on 127.0.0.1:18336 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.207166 [INFO] agent: started state syncer
2019/12/30 18:55:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:22 [INFO]  raft: Node at 127.0.0.1:18340 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:22.312853 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:22.669975 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:22.670424 [DEBUG] consul: Skipping self join check for "Node c3319e0f-3a56-c966-773d-9c9467f40729" since the cluster is too small
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:22.670554 [INFO] consul: member 'Node c3319e0f-3a56-c966-773d-9c9467f40729' joined, marking health alive
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:22.693826 [DEBUG] agent: Node info in sync
2019/12/30 18:55:22 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:22 [INFO]  raft: Node at 127.0.0.1:18340 [Leader] entering Leader state
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.901361 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:22.901832 [INFO] consul: New leader elected: Node f4b7d691-1431-932c-b92d-23752b33f979
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:23.194229 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:23.194923 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:23.417431 [INFO] agent: Synced node info
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:23.417559 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:23.676989 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:23.677526 [DEBUG] consul: Skipping self join check for "Node 6762e2ab-3aab-04a3-819d-fd58d2332bda" since the cluster is too small
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:23.677691 [INFO] consul: member 'Node 6762e2ab-3aab-04a3-819d-fd58d2332bda' joined, marking health alive
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.758085 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 3.125749ms) from client 127.0.0.1:40116 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.758278 [DEBUG] dns: request for name a955ab87-b47e-fd69-2fc9-8108cfb034e1.query.consul. type ANY class IN (took 1.362036ms) from client 127.0.0.1:46267 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.758474 [DEBUG] tlsutil: Update with version 2
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.759245 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.761573 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 1.423038ms) from client 127.0.0.1:44394 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.762165 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.770734 [INFO] consul: shutting down server
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:23.771185 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.036604 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.494702 [INFO] manager: shutting down
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.510237 [INFO] agent: consul server down
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.510423 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.510488 [INFO] agent: Stopping DNS server 127.0.0.1:18329 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.510671 [INFO] agent: Stopping DNS server 127.0.0.1:18329 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.510888 [INFO] agent: Stopping HTTP server 127.0.0.1:18330 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.511238 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.511313 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_OnlyPassing (5.58s)
=== CONT  TestDNS_ServiceLookup_FilterCritical
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.528628 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ServiceLookup_OnlyPassing - 2019/12/30 18:55:24.528723 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:24.639683 [WARN] agent: Node name "Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:24.644759 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:24.653588 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.346034 [DEBUG] agent: Node info in sync
2019/12/30 18:55:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6de991a2-fc8a-dc21-e65f-c7adbf4b9a10 Address:127.0.0.1:18346}]
2019/12/30 18:55:25 [INFO]  raft: Node at 127.0.0.1:18346 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.800948 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.809460 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 772.021µs) from client 127.0.0.1:59933 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.811345 [DEBUG] dns: request for name 99aa4fb2-499b-49cb-20d8-b4a1438cbe30.query.consul. type ANY class IN (took 998.36µs) from client 127.0.0.1:48414 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.811525 [INFO] serf: EventMemberJoin: Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10.dc1 127.0.0.1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.811567 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.811621 [INFO] consul: shutting down server
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.811663 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.823331 [INFO] serf: EventMemberJoin: Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10 127.0.0.1
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.824018 [INFO] consul: Adding LAN server Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10 (Addr: tcp/127.0.0.1:18346) (DC: dc1)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.824733 [INFO] consul: Handled member-join event for server "Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10.dc1" in area "wan"
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.826056 [INFO] agent: Started DNS server 127.0.0.1:18341 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.826170 [INFO] agent: Started DNS server 127.0.0.1:18341 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.830290 [INFO] agent: Started HTTP server on 127.0.0.1:18342 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:25.830379 [INFO] agent: started state syncer
2019/12/30 18:55:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:25 [INFO]  raft: Node at 127.0.0.1:18346 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:25.966716 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.047345 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.047799 [DEBUG] consul: Skipping self join check for "Node f4b7d691-1431-932c-b92d-23752b33f979" since the cluster is too small
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.047958 [INFO] consul: member 'Node f4b7d691-1431-932c-b92d-23752b33f979' joined, marking health alive
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.051290 [INFO] manager: shutting down
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.215862 [INFO] agent: consul server down
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.215940 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.216001 [INFO] agent: Stopping DNS server 127.0.0.1:18335 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.216145 [INFO] agent: Stopping DNS server 127.0.0.1:18335 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.216299 [INFO] agent: Stopping HTTP server 127.0.0.1:18336 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.216496 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.216570 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_OnlyFailing (5.75s)
=== CONT  TestDNS_RecursorTimeout
TestDNS_ServiceLookup_OnlyFailing - 2019/12/30 18:55:26.221506 [ERR] consul: failed to reconcile member: {Node f4b7d691-1431-932c-b92d-23752b33f979 127.0.0.1 18338 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:f4b7d691-1431-932c-b92d-23752b33f979 port:18340 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:18339] alive 1 5 2 2 5 4}: leadership lost while committing log
2019/12/30 18:55:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:26 [INFO]  raft: Node at 127.0.0.1:18346 [Leader] entering Leader state
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:26.539570 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:26.540186 [INFO] consul: New leader elected: Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_RecursorTimeout - 2019/12/30 18:55:26.580465 [WARN] agent: Node name "Node f51227dd-c9d0-8f19-e297-d747ad6a8657" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_RecursorTimeout - 2019/12/30 18:55:26.580963 [DEBUG] tlsutil: Update with version 1
TestDNS_RecursorTimeout - 2019/12/30 18:55:26.584134 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:26.834990 [INFO] agent: Synced node info
2019/12/30 18:55:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f51227dd-c9d0-8f19-e297-d747ad6a8657 Address:127.0.0.1:18352}]
2019/12/30 18:55:27 [INFO]  raft: Node at 127.0.0.1:18352 [Follower] entering Follower state (Leader: "")
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.655488 [INFO] serf: EventMemberJoin: Node f51227dd-c9d0-8f19-e297-d747ad6a8657.dc1 127.0.0.1
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.658656 [INFO] serf: EventMemberJoin: Node f51227dd-c9d0-8f19-e297-d747ad6a8657 127.0.0.1
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.659777 [INFO] consul: Handled member-join event for server "Node f51227dd-c9d0-8f19-e297-d747ad6a8657.dc1" in area "wan"
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.659794 [INFO] consul: Adding LAN server Node f51227dd-c9d0-8f19-e297-d747ad6a8657 (Addr: tcp/127.0.0.1:18352) (DC: dc1)
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.661797 [DEBUG] dns: recursor enabled
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.662365 [DEBUG] dns: recursor enabled
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.662297 [INFO] agent: Started DNS server 127.0.0.1:18347 (tcp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.662957 [INFO] agent: Started DNS server 127.0.0.1:18347 (udp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.666380 [INFO] agent: Started HTTP server on 127.0.0.1:18348 (tcp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:27.666755 [INFO] agent: started state syncer
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:27.689628 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:27.689744 [DEBUG] agent: Node info in sync
2019/12/30 18:55:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:27 [INFO]  raft: Node at 127.0.0.1:18352 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:55:28.041760 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:55:28.041838 [DEBUG] agent: Node info in sync
2019/12/30 18:55:28 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:28 [INFO]  raft: Node at 127.0.0.1:18352 [Leader] entering Leader state
TestDNS_RecursorTimeout - 2019/12/30 18:55:28.442562 [INFO] consul: cluster leadership acquired
TestDNS_RecursorTimeout - 2019/12/30 18:55:28.443015 [INFO] consul: New leader elected: Node f51227dd-c9d0-8f19-e297-d747ad6a8657
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:28.628005 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:28.920309 [DEBUG] consul: Skipping self join check for "Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10" since the cluster is too small
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:28.920690 [INFO] consul: member 'Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10' joined, marking health alive
TestDNS_RecursorTimeout - 2019/12/30 18:55:29.009222 [INFO] agent: Synced node info
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.450629 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.455978 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 962.692µs) from client 127.0.0.1:38201 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.457744 [DEBUG] dns: request for name 64c3c387-8b35-690b-1d17-85703536175d.query.consul. type ANY class IN (took 929.025µs) from client 127.0.0.1:48021 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.457988 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.458055 [INFO] consul: shutting down server
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.458097 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.467457 [WARN] consul: error getting server health from "Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10": rpc error making call: EOF
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.555846 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.701144 [INFO] manager: shutting down
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.701653 [INFO] agent: consul server down
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.701706 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.701758 [INFO] agent: Stopping DNS server 127.0.0.1:18341 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.701913 [INFO] agent: Stopping DNS server 127.0.0.1:18341 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.702076 [INFO] agent: Stopping HTTP server 127.0.0.1:18342 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.702313 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:29.702385 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_FilterCritical (5.19s)
=== CONT  TestDNS_Recurse_Truncation
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Recurse_Truncation - 2019/12/30 18:55:29.787385 [WARN] agent: Node name "Node 779461ab-8b22-059e-002a-064adebb5201" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Recurse_Truncation - 2019/12/30 18:55:29.787794 [DEBUG] tlsutil: Update with version 1
TestDNS_Recurse_Truncation - 2019/12/30 18:55:29.792295 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_RecursorTimeout - 2019/12/30 18:55:30.192451 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_RecursorTimeout - 2019/12/30 18:55:30.192958 [DEBUG] consul: Skipping self join check for "Node f51227dd-c9d0-8f19-e297-d747ad6a8657" since the cluster is too small
TestDNS_RecursorTimeout - 2019/12/30 18:55:30.193119 [INFO] consul: member 'Node f51227dd-c9d0-8f19-e297-d747ad6a8657' joined, marking health alive
TestDNS_ServiceLookup_FilterCritical - 2019/12/30 18:55:30.450769 [WARN] consul: error getting server health from "Node 6de991a2-fc8a-dc21-e65f-c7adbf4b9a10": context deadline exceeded
TestDNS_RecursorTimeout - 2019/12/30 18:55:30.503332 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDNS_RecursorTimeout - 2019/12/30 18:55:30.503415 [DEBUG] agent: Node info in sync
TestDNS_RecursorTimeout - 2019/12/30 18:55:30.503493 [DEBUG] agent: Node info in sync
2019/12/30 18:55:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:779461ab-8b22-059e-002a-064adebb5201 Address:127.0.0.1:18358}]
2019/12/30 18:55:30 [INFO]  raft: Node at 127.0.0.1:18358 [Follower] entering Follower state (Leader: "")
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.888615 [INFO] serf: EventMemberJoin: Node 779461ab-8b22-059e-002a-064adebb5201.dc1 127.0.0.1
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.898252 [INFO] serf: EventMemberJoin: Node 779461ab-8b22-059e-002a-064adebb5201 127.0.0.1
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.900191 [INFO] consul: Adding LAN server Node 779461ab-8b22-059e-002a-064adebb5201 (Addr: tcp/127.0.0.1:18358) (DC: dc1)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.900358 [INFO] consul: Handled member-join event for server "Node 779461ab-8b22-059e-002a-064adebb5201.dc1" in area "wan"
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.901470 [DEBUG] dns: recursor enabled
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.901514 [DEBUG] dns: recursor enabled
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.902022 [INFO] agent: Started DNS server 127.0.0.1:18353 (udp)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.902084 [INFO] agent: Started DNS server 127.0.0.1:18353 (tcp)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.905052 [INFO] agent: Started HTTP server on 127.0.0.1:18354 (tcp)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:30.905197 [INFO] agent: started state syncer
2019/12/30 18:55:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:30 [INFO]  raft: Node at 127.0.0.1:18358 [Candidate] entering Candidate state in term 2
TestDNS_RecursorTimeout - 2019/12/30 18:55:31.281898 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:55:31.520627 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:55:31.520717 [DEBUG] agent: Node info in sync
2019/12/30 18:55:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:31 [INFO]  raft: Node at 127.0.0.1:18358 [Leader] entering Leader state
TestDNS_Recurse_Truncation - 2019/12/30 18:55:31.692449 [INFO] consul: cluster leadership acquired
TestDNS_Recurse_Truncation - 2019/12/30 18:55:31.692885 [INFO] consul: New leader elected: Node 779461ab-8b22-059e-002a-064adebb5201
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.028376 [ERR] dns: recurse failed: read udp 127.0.0.1:54134->127.0.0.1:42008: i/o timeout
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.028510 [ERR] dns: all resolvers failed for {apple.com. 255 1} from client 127.0.0.1:58314 (udp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.028926 [INFO] agent: Requesting shutdown
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.029000 [INFO] consul: shutting down server
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.029047 [WARN] serf: Shutdown without a Leave
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.029912 [DEBUG] dns: request for {apple.com. 255 1} (udp) (3.003179662s) from client 127.0.0.1:58314 (udp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.175058 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.188280 [INFO] agent: Synced node info
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.188416 [DEBUG] agent: Node info in sync
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.213604 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (437.345µs) Recursor queried: 127.0.0.1:59791
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.213873 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.169364ms) from client 127.0.0.1:34002 (udp)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.214298 [INFO] agent: Requesting shutdown
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.214468 [INFO] consul: shutting down server
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.214524 [WARN] serf: Shutdown without a Leave
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.359272 [INFO] manager: shutting down
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.360086 [INFO] agent: consul server down
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.360142 [INFO] agent: shutdown complete
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.360201 [INFO] agent: Stopping DNS server 127.0.0.1:18347 (tcp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.360336 [INFO] agent: Stopping DNS server 127.0.0.1:18347 (udp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.360502 [INFO] agent: Stopping HTTP server 127.0.0.1:18348 (tcp)
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.360806 [INFO] agent: Waiting for endpoints to shut down
TestDNS_RecursorTimeout - 2019/12/30 18:55:32.360928 [INFO] agent: Endpoints down
=== CONT  TestDNS_Recurse
--- PASS: TestDNS_RecursorTimeout (6.14s)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.369205 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.448223 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Recurse - 2019/12/30 18:55:32.460354 [WARN] agent: Node name "Node 9fe8f0ac-db49-b047-b8da-498f7578989e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Recurse - 2019/12/30 18:55:32.460782 [DEBUG] tlsutil: Update with version 1
TestDNS_Recurse - 2019/12/30 18:55:32.463486 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.708924 [INFO] agent: consul server down
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.709012 [INFO] agent: shutdown complete
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.709076 [INFO] agent: Stopping DNS server 127.0.0.1:18353 (tcp)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.709231 [INFO] agent: Stopping DNS server 127.0.0.1:18353 (udp)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.709469 [INFO] agent: Stopping HTTP server 127.0.0.1:18354 (tcp)
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.709685 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.709766 [INFO] agent: Endpoints down
--- PASS: TestDNS_Recurse_Truncation (3.01s)
=== CONT  TestDNS_ServiceLookup_Dedup_SRV
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.715420 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_Recurse_Truncation - 2019/12/30 18:55:32.715693 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:32.838643 [WARN] agent: Node name "Node dad8cc0c-940f-f8cd-1212-9cce413fb293" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:32.839781 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:32.846416 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9fe8f0ac-db49-b047-b8da-498f7578989e Address:127.0.0.1:18364}]
2019/12/30 18:55:33 [INFO]  raft: Node at 127.0.0.1:18364 [Follower] entering Follower state (Leader: "")
TestDNS_Recurse - 2019/12/30 18:55:33.536420 [INFO] serf: EventMemberJoin: Node 9fe8f0ac-db49-b047-b8da-498f7578989e.dc1 127.0.0.1
TestDNS_Recurse - 2019/12/30 18:55:33.540137 [INFO] serf: EventMemberJoin: Node 9fe8f0ac-db49-b047-b8da-498f7578989e 127.0.0.1
TestDNS_Recurse - 2019/12/30 18:55:33.541773 [DEBUG] dns: recursor enabled
TestDNS_Recurse - 2019/12/30 18:55:33.542720 [INFO] agent: Started DNS server 127.0.0.1:18359 (udp)
TestDNS_Recurse - 2019/12/30 18:55:33.543130 [INFO] consul: Adding LAN server Node 9fe8f0ac-db49-b047-b8da-498f7578989e (Addr: tcp/127.0.0.1:18364) (DC: dc1)
TestDNS_Recurse - 2019/12/30 18:55:33.543327 [INFO] consul: Handled member-join event for server "Node 9fe8f0ac-db49-b047-b8da-498f7578989e.dc1" in area "wan"
TestDNS_Recurse - 2019/12/30 18:55:33.543647 [DEBUG] dns: recursor enabled
TestDNS_Recurse - 2019/12/30 18:55:33.544140 [INFO] agent: Started DNS server 127.0.0.1:18359 (tcp)
TestDNS_Recurse - 2019/12/30 18:55:33.547377 [INFO] agent: Started HTTP server on 127.0.0.1:18360 (tcp)
TestDNS_Recurse - 2019/12/30 18:55:33.547492 [INFO] agent: started state syncer
2019/12/30 18:55:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:33 [INFO]  raft: Node at 127.0.0.1:18364 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:dad8cc0c-940f-f8cd-1212-9cce413fb293 Address:127.0.0.1:18370}]
2019/12/30 18:55:33 [INFO]  raft: Node at 127.0.0.1:18370 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.872402 [INFO] serf: EventMemberJoin: Node dad8cc0c-940f-f8cd-1212-9cce413fb293.dc1 127.0.0.1
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.879573 [INFO] serf: EventMemberJoin: Node dad8cc0c-940f-f8cd-1212-9cce413fb293 127.0.0.1
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.880271 [INFO] consul: Adding LAN server Node dad8cc0c-940f-f8cd-1212-9cce413fb293 (Addr: tcp/127.0.0.1:18370) (DC: dc1)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.880913 [INFO] consul: Handled member-join event for server "Node dad8cc0c-940f-f8cd-1212-9cce413fb293.dc1" in area "wan"
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.881075 [INFO] agent: Started DNS server 127.0.0.1:18365 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.881419 [INFO] agent: Started DNS server 127.0.0.1:18365 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.883677 [INFO] agent: Started HTTP server on 127.0.0.1:18366 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:33.883758 [INFO] agent: started state syncer
2019/12/30 18:55:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:33 [INFO]  raft: Node at 127.0.0.1:18370 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:34 [INFO]  raft: Node at 127.0.0.1:18364 [Leader] entering Leader state
TestDNS_Recurse - 2019/12/30 18:55:34.223872 [INFO] consul: cluster leadership acquired
TestDNS_Recurse - 2019/12/30 18:55:34.224258 [INFO] consul: New leader elected: Node 9fe8f0ac-db49-b047-b8da-498f7578989e
jones - 2019/12/30 18:55:35.988739 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:55:35.988831 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/30 18:55:35.988865 [DEBUG] agent: Node info in sync
2019/12/30 18:55:36 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:36 [INFO]  raft: Node at 127.0.0.1:18370 [Leader] entering Leader state
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:36.004663 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:36.005108 [INFO] consul: New leader elected: Node dad8cc0c-940f-f8cd-1212-9cce413fb293
TestDNS_Recurse - 2019/12/30 18:55:36.135558 [INFO] agent: Synced node info
TestDNS_Recurse - 2019/12/30 18:55:36.142066 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (445.345µs) Recursor queried: 127.0.0.1:42769
TestDNS_Recurse - 2019/12/30 18:55:36.142316 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.112029ms) from client 127.0.0.1:56365 (udp)
TestDNS_Recurse - 2019/12/30 18:55:36.142389 [INFO] agent: Requesting shutdown
TestDNS_Recurse - 2019/12/30 18:55:36.142474 [INFO] consul: shutting down server
TestDNS_Recurse - 2019/12/30 18:55:36.142527 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse - 2019/12/30 18:55:36.275198 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse - 2019/12/30 18:55:36.392015 [INFO] manager: shutting down
TestDNS_Recurse - 2019/12/30 18:55:36.392614 [INFO] agent: consul server down
TestDNS_Recurse - 2019/12/30 18:55:36.392686 [INFO] agent: shutdown complete
TestDNS_Recurse - 2019/12/30 18:55:36.392747 [INFO] agent: Stopping DNS server 127.0.0.1:18359 (tcp)
TestDNS_Recurse - 2019/12/30 18:55:36.392912 [INFO] agent: Stopping DNS server 127.0.0.1:18359 (udp)
TestDNS_Recurse - 2019/12/30 18:55:36.393081 [INFO] agent: Stopping HTTP server 127.0.0.1:18360 (tcp)
TestDNS_Recurse - 2019/12/30 18:55:36.393294 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_Recurse - 2019/12/30 18:55:36.393310 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Recurse - 2019/12/30 18:55:36.393416 [INFO] agent: Endpoints down
--- PASS: TestDNS_Recurse (4.04s)
=== CONT  TestDNS_ServiceLookup_PreparedQueryNamePeriod
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:36.467671 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:36.515370 [WARN] agent: Node name "Node 382b8eff-b955-1f4d-644f-a6d778c2b4dc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:36.516005 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:36.527495 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:36.870214 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:36.870350 [DEBUG] agent: Node info in sync
2019/12/30 18:55:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:382b8eff-b955-1f4d-644f-a6d778c2b4dc Address:127.0.0.1:18376}]
2019/12/30 18:55:37 [INFO]  raft: Node at 127.0.0.1:18376 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.638007 [INFO] serf: EventMemberJoin: Node 382b8eff-b955-1f4d-644f-a6d778c2b4dc.dc1 127.0.0.1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.648365 [INFO] serf: EventMemberJoin: Node 382b8eff-b955-1f4d-644f-a6d778c2b4dc 127.0.0.1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.649806 [INFO] consul: Adding LAN server Node 382b8eff-b955-1f4d-644f-a6d778c2b4dc (Addr: tcp/127.0.0.1:18376) (DC: dc1)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.650115 [INFO] consul: Handled member-join event for server "Node 382b8eff-b955-1f4d-644f-a6d778c2b4dc.dc1" in area "wan"
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.651977 [INFO] agent: Started DNS server 127.0.0.1:18371 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.652230 [INFO] agent: Started DNS server 127.0.0.1:18371 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.654683 [INFO] agent: Started HTTP server on 127.0.0.1:18372 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:37.654793 [INFO] agent: started state syncer
2019/12/30 18:55:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:37 [INFO]  raft: Node at 127.0.0.1:18376 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:38 [INFO]  raft: Node at 127.0.0.1:18376 [Leader] entering Leader state
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:38.367781 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:38.368218 [INFO] consul: New leader elected: Node 382b8eff-b955-1f4d-644f-a6d778c2b4dc
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:38.825919 [INFO] agent: Synced node info
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:38.826039 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:38.843114 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.087362ms) from client 127.0.0.1:37523 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:38.845707 [DEBUG] dns: request for name defbaf9c-a2c0-fbc1-0fd7-b7daf8bb9115.query.consul. type SRV class IN (took 1.073695ms) from client 127.0.0.1:34204 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:38.846401 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:38.846473 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:38.846519 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:38.968628 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.058874 [INFO] manager: shutting down
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.060073 [INFO] agent: consul server down
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.060169 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.060259 [INFO] agent: Stopping DNS server 127.0.0.1:18365 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.060476 [INFO] agent: Stopping DNS server 127.0.0.1:18365 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.060671 [INFO] agent: Stopping HTTP server 127.0.0.1:18366 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.060916 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.061007 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Dedup_SRV (6.35s)
=== CONT  TestDNS_PreparedQueryNearIP
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.064712 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/30 18:55:39.064807 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:39.126937 [WARN] agent: Node name "Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:39.127384 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:39.129578 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:39.522902 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:39.866916 [DEBUG] dns: request for name some.query.we.like.query.consul. type SRV class IN (took 1.13703ms) from client 127.0.0.1:43212 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:39.867047 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:39.867125 [INFO] consul: shutting down server
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:39.867169 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:39.937833 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.033723 [INFO] manager: shutting down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.035452 [INFO] agent: consul server down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.035508 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.035563 [INFO] agent: Stopping DNS server 127.0.0.1:18371 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.035703 [INFO] agent: Stopping DNS server 127.0.0.1:18371 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.035865 [INFO] agent: Stopping HTTP server 127.0.0.1:18372 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.036121 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.036205 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_PreparedQueryNamePeriod (3.64s)
=== CONT  TestDNS_PreparedQueryNearIPEDNS
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/30 18:55:40.038875 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:40.118141 [WARN] agent: Node name "Node 277c3ae9-12dc-2861-c44b-6a645bdff006" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:40.118774 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:40.121768 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:69cf53d6-bffc-28b8-5d42-97b2d78d1857 Address:127.0.0.1:18382}]
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.438635 [INFO] serf: EventMemberJoin: Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857.dc1 127.0.0.1
2019/12/30 18:55:40 [INFO]  raft: Node at 127.0.0.1:18382 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.447099 [INFO] serf: EventMemberJoin: Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857 127.0.0.1
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.448047 [INFO] consul: Adding LAN server Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857 (Addr: tcp/127.0.0.1:18382) (DC: dc1)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.448584 [INFO] consul: Handled member-join event for server "Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857.dc1" in area "wan"
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.450016 [INFO] agent: Started DNS server 127.0.0.1:18377 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.450112 [INFO] agent: Started DNS server 127.0.0.1:18377 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.452569 [INFO] agent: Started HTTP server on 127.0.0.1:18378 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:40.452686 [INFO] agent: started state syncer
2019/12/30 18:55:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:40 [INFO]  raft: Node at 127.0.0.1:18382 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:55:40.945476 [DEBUG] consul: Skipping self join check for "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786" since the cluster is too small
2019/12/30 18:55:41 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:41 [INFO]  raft: Node at 127.0.0.1:18382 [Leader] entering Leader state
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:41.266109 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:41.266589 [INFO] consul: New leader elected: Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857
2019/12/30 18:55:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:277c3ae9-12dc-2861-c44b-6a645bdff006 Address:127.0.0.1:18388}]
2019/12/30 18:55:41 [INFO]  raft: Node at 127.0.0.1:18388 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.531661 [INFO] serf: EventMemberJoin: Node 277c3ae9-12dc-2861-c44b-6a645bdff006.dc1 127.0.0.1
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.537277 [INFO] serf: EventMemberJoin: Node 277c3ae9-12dc-2861-c44b-6a645bdff006 127.0.0.1
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.538108 [INFO] consul: Adding LAN server Node 277c3ae9-12dc-2861-c44b-6a645bdff006 (Addr: tcp/127.0.0.1:18388) (DC: dc1)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.538887 [INFO] consul: Handled member-join event for server "Node 277c3ae9-12dc-2861-c44b-6a645bdff006.dc1" in area "wan"
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.542189 [INFO] agent: Started DNS server 127.0.0.1:18383 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.542768 [INFO] agent: Started DNS server 127.0.0.1:18383 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.546062 [INFO] agent: Started HTTP server on 127.0.0.1:18384 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:41.546189 [INFO] agent: started state syncer
2019/12/30 18:55:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:41 [INFO]  raft: Node at 127.0.0.1:18388 [Candidate] entering Candidate state in term 2
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:41.726636 [INFO] agent: Synced node info
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:41.726770 [DEBUG] agent: Node info in sync
2019/12/30 18:55:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:42 [INFO]  raft: Node at 127.0.0.1:18388 [Leader] entering Leader state
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:42.211431 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:42.211905 [INFO] consul: New leader elected: Node 277c3ae9-12dc-2861-c44b-6a645bdff006
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:42.617974 [INFO] agent: Synced node info
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:42.618112 [DEBUG] agent: Node info in sync
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:42.711479 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:55:43.319297 [DEBUG] consul: Skipping self join check for "Node 90e88a15-5862-4de0-2f1f-c638261bac76" since the cluster is too small
Added 3 service nodes
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:43.585382 [DEBUG] agent: Node info in sync
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:44.220380 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
Added 3 service nodes
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.086878 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.089605 [DEBUG] consul: Skipping self join check for "Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857" since the cluster is too small
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.089899 [INFO] consul: member 'Node 69cf53d6-bffc-28b8-5d42-97b2d78d1857' joined, marking health alive
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.091654 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 3.018414ms) from client 127.0.0.1:35268 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.118671 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.03036ms) from client 127.0.0.1:51103 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.146111 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.012027ms) from client 127.0.0.1:39210 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.173619 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.075029ms) from client 127.0.0.1:51210 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.201086 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.113696ms) from client 127.0.0.1:43527 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.228395 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.028694ms) from client 127.0.0.1:55413 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.255881 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.108696ms) from client 127.0.0.1:34056 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.256015 [INFO] agent: Requesting shutdown
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.256109 [INFO] consul: shutting down server
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.256165 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:55:45.418873 [DEBUG] consul: Skipping self join check for "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6" since the cluster is too small
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:45.421636 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.422808 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.660950 [INFO] manager: shutting down
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.661470 [INFO] agent: consul server down
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.661525 [INFO] agent: shutdown complete
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.661584 [INFO] agent: Stopping DNS server 127.0.0.1:18377 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.661722 [INFO] agent: Stopping DNS server 127.0.0.1:18377 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.661898 [INFO] agent: Stopping HTTP server 127.0.0.1:18378 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.662141 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQueryNearIP - 2019/12/30 18:55:45.662217 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQueryNearIP (6.60s)
=== CONT  TestDNS_ServiceLookup_TagPeriod
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:45.725169 [WARN] agent: Node name "Node 2410fa8a-cdc5-d558-c3b8-432c3b33fcc6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:45.725872 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:45.728568 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:46.935969 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:46.936384 [DEBUG] consul: Skipping self join check for "Node 277c3ae9-12dc-2861-c44b-6a645bdff006" since the cluster is too small
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:46.936539 [INFO] consul: member 'Node 277c3ae9-12dc-2861-c44b-6a645bdff006' joined, marking health alive
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:46.939463 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.368369ms) from client 127.0.0.1:40542 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:46.939831 [INFO] agent: Requesting shutdown
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:46.939955 [INFO] consul: shutting down server
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:46.940026 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.538721 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.739465 [INFO] manager: shutting down
jones - 2019/12/30 18:55:47.739693 [DEBUG] consul: Skipping self join check for "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef" since the cluster is too small
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.740307 [INFO] agent: consul server down
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.740376 [INFO] agent: shutdown complete
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.740433 [INFO] agent: Stopping DNS server 127.0.0.1:18383 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.740750 [WARN] consul: error getting server health from "Node 277c3ae9-12dc-2861-c44b-6a645bdff006": rpc error making call: EOF
2019/12/30 18:55:47 [WARN]  raft: could not get configuration for Stats: raft is already shutdown
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.742336 [INFO] agent: Stopping DNS server 127.0.0.1:18383 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.742662 [INFO] agent: Stopping HTTP server 127.0.0.1:18384 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.743008 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:47.743246 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQueryNearIPEDNS (7.71s)
=== CONT  TestDNS_CaseInsensitiveServiceLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:47.815048 [WARN] agent: Node name "Node f47429b6-1650-236f-b72a-2ceacf3d811c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:47.818556 [DEBUG] tlsutil: Update with version 1
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:47.822551 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQueryNearIPEDNS - 2019/12/30 18:55:48.737976 [WARN] consul: error getting server health from "Node 277c3ae9-12dc-2861-c44b-6a645bdff006": context deadline exceeded
2019/12/30 18:55:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2410fa8a-cdc5-d558-c3b8-432c3b33fcc6 Address:127.0.0.1:18394}]
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.820218 [INFO] serf: EventMemberJoin: Node 2410fa8a-cdc5-d558-c3b8-432c3b33fcc6.dc1 127.0.0.1
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.824557 [INFO] serf: EventMemberJoin: Node 2410fa8a-cdc5-d558-c3b8-432c3b33fcc6 127.0.0.1
2019/12/30 18:55:48 [INFO]  raft: Node at 127.0.0.1:18394 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.827607 [INFO] consul: Handled member-join event for server "Node 2410fa8a-cdc5-d558-c3b8-432c3b33fcc6.dc1" in area "wan"
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.828574 [INFO] consul: Adding LAN server Node 2410fa8a-cdc5-d558-c3b8-432c3b33fcc6 (Addr: tcp/127.0.0.1:18394) (DC: dc1)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.831095 [INFO] agent: Started DNS server 127.0.0.1:18389 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.831606 [INFO] agent: Started DNS server 127.0.0.1:18389 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.834166 [INFO] agent: Started HTTP server on 127.0.0.1:18390 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:48.834280 [INFO] agent: started state syncer
2019/12/30 18:55:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:48 [INFO]  raft: Node at 127.0.0.1:18394 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:55:49.204017 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:55:49.204104 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:49.240789 [WARN] consul: error getting server health from "Node c3319e0f-3a56-c966-773d-9c9467f40729": context deadline exceeded
jones - 2019/12/30 18:55:50.302309 [DEBUG] consul: Skipping self join check for "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d" since the cluster is too small
2019/12/30 18:55:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:50 [INFO]  raft: Node at 127.0.0.1:18394 [Leader] entering Leader state
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:50.896075 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:50.897332 [INFO] consul: New leader elected: Node 2410fa8a-cdc5-d558-c3b8-432c3b33fcc6
2019/12/30 18:55:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f47429b6-1650-236f-b72a-2ceacf3d811c Address:127.0.0.1:18400}]
2019/12/30 18:55:51 [INFO]  raft: Node at 127.0.0.1:18400 [Follower] entering Follower state (Leader: "")
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.138375 [INFO] serf: EventMemberJoin: Node f47429b6-1650-236f-b72a-2ceacf3d811c.dc1 127.0.0.1
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.142073 [INFO] serf: EventMemberJoin: Node f47429b6-1650-236f-b72a-2ceacf3d811c 127.0.0.1
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.143000 [INFO] consul: Handled member-join event for server "Node f47429b6-1650-236f-b72a-2ceacf3d811c.dc1" in area "wan"
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.143388 [INFO] consul: Adding LAN server Node f47429b6-1650-236f-b72a-2ceacf3d811c (Addr: tcp/127.0.0.1:18400) (DC: dc1)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.143678 [INFO] agent: Started DNS server 127.0.0.1:18395 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.143740 [INFO] agent: Started DNS server 127.0.0.1:18395 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.164450 [INFO] agent: Started HTTP server on 127.0.0.1:18396 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:51.164571 [INFO] agent: started state syncer
2019/12/30 18:55:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:51 [INFO]  raft: Node at 127.0.0.1:18400 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:55:51.578612 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:55:51.578711 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:52.180652 [INFO] agent: Synced node info
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:52.180861 [DEBUG] agent: Node info in sync
2019/12/30 18:55:52 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:52 [INFO]  raft: Node at 127.0.0.1:18400 [Leader] entering Leader state
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:52.624216 [INFO] consul: cluster leadership acquired
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:52.624877 [INFO] consul: New leader elected: Node f47429b6-1650-236f-b72a-2ceacf3d811c
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:53.769887 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.008374 [DEBUG] dns: request for name v1.master2.db.service.consul. type SRV class IN (took 757.353µs) from client 127.0.0.1:41888 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.010377 [DEBUG] dns: request for name v1.master.db.service.consul. type SRV class IN (took 736.02µs) from client 127.0.0.1:33651 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.010522 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.010619 [INFO] consul: shutting down server
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.010676 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:54.052328 [WARN] consul: error getting server health from "Node 6762e2ab-3aab-04a3-819d-fd58d2332bda": context deadline exceeded
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.317244 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:54.318524 [INFO] agent: Synced node info
jones - 2019/12/30 18:55:54.318675 [DEBUG] consul: Skipping self join check for "Node 5122c9d8-8979-c841-956f-094a90e62880" since the cluster is too small
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.344264 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 20.069197ms) from client 127.0.0.1:49691 (udp)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.349191 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.349305 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.349361 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.351967 [DEBUG] dns: request for name a3fd1f8d-0be2-4461-a28f-00473794834e.query.consul. type ANY class IN (took 13.48669ms) from client 127.0.0.1:60628 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.467427 [INFO] manager: shutting down
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.600686 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:54.623067 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:54.623193 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.676061 [INFO] agent: consul server down
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.676266 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.676360 [INFO] agent: Stopping DNS server 127.0.0.1:18389 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.676557 [INFO] agent: Stopping DNS server 127.0.0.1:18389 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.676743 [INFO] agent: Stopping HTTP server 127.0.0.1:18390 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.676995 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.677100 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_TagPeriod (9.01s)
=== CONT  TestDNS_ServiceLookup_ServiceAddressIPV6
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.677627 [INFO] manager: shutting down
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.678433 [INFO] agent: consul server down
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.678499 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.678554 [INFO] agent: Stopping DNS server 127.0.0.1:18317 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.678717 [INFO] agent: Stopping DNS server 127.0.0.1:18317 (udp)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.679133 [INFO] agent: Stopping HTTP server 127.0.0.1:18318 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.679496 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Truncate - 2019/12/30 18:55:54.679699 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Truncate (37.26s)
=== CONT  TestDNS_ServiceLookup_ServiceAddress_CNAME
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.682246 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.682498 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup_TagPeriod - 2019/12/30 18:55:54.682570 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:54.831741 [WARN] agent: Node name "Node ec4c7197-2aee-c4b6-c1d8-77d0258eb70f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:54.832024 [WARN] agent: Node name "Node a959db6c-f0ee-b7b4-8753-17fcc3d22503" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:54.832842 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:54.834798 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:54.838082 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:54.840981 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.308754 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 11.570306ms) from client 127.0.0.1:55747 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.313276 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 11.478303ms) from client 127.0.0.1:46032 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.316457 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 6.732511ms) from client 127.0.0.1:40947 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.346751 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 23.673625ms) from client 127.0.0.1:38554 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.362055 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 7.434529ms) from client 127.0.0.1:50869 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.367654 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 6.876182ms) from client 127.0.0.1:39235 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.388689 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 8.795233ms) from client 127.0.0.1:56672 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.392050 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.841821ms) from client 127.0.0.1:49446 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.392342 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 15.098398ms) from client 127.0.0.1:35627 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.398789 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.722151ms) from client 127.0.0.1:59235 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.404962 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 5.42981ms) from client 127.0.0.1:40690 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.417645 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 11.637974ms) from client 127.0.0.1:39549 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.423826 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 5.354808ms) from client 127.0.0.1:50465 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.431174 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 6.390502ms) from client 127.0.0.1:51955 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.486486 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 53.425743ms) from client 127.0.0.1:58362 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.491834 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 21.271562ms) from client 127.0.0.1:52120 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.497073 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 5.390142ms) from client 127.0.0.1:42849 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.504658 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 6.81318ms) from client 127.0.0.1:39854 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.509800 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 5.416143ms) from client 127.0.0.1:50462 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.517136 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.517235 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.517289 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:55.518650 [DEBUG] dns: request for name 0801cb04-4e8f-7e81-b63d-ae70fcc8bafd.query.consul. type ANY class IN (took 7.902875ms) from client 127.0.0.1:44420 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.337712 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.375095 [DEBUG] dns: request for name master.db.service.consul. type SRV class IN (took 571.349µs) from client 127.0.0.1:56871 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.376907 [DEBUG] dns: request for name mASTER.dB.service.consul. type SRV class IN (took 508.68µs) from client 127.0.0.1:39400 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.378141 [DEBUG] dns: request for name MASTER.dB.service.consul. type SRV class IN (took 492.346µs) from client 127.0.0.1:51565 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.379714 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 766.687µs) from client 127.0.0.1:54507 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.381113 [DEBUG] dns: request for name DB.service.consul. type SRV class IN (took 722.019µs) from client 127.0.0.1:50578 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.382399 [DEBUG] dns: request for name Db.service.consul. type SRV class IN (took 651.351µs) from client 127.0.0.1:41479 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.383874 [DEBUG] dns: request for name somequery.query.consul. type SRV class IN (took 797.021µs) from client 127.0.0.1:45553 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.385668 [DEBUG] dns: request for name SomeQuery.query.consul. type SRV class IN (took 897.357µs) from client 127.0.0.1:56698 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.387077 [DEBUG] dns: request for name SOMEQUERY.query.consul. type SRV class IN (took 771.354µs) from client 127.0.0.1:46727 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.387267 [INFO] agent: Requesting shutdown
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.387411 [INFO] consul: shutting down server
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.387540 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.767412 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:55:56.768930 [DEBUG] consul: Skipping self join check for "Node a8b3e297-b53a-bcd0-efda-5addcd938805" since the cluster is too small
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.771451 [INFO] manager: shutting down
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.772151 [INFO] agent: consul server down
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.772202 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.772253 [INFO] agent: Stopping DNS server 127.0.0.1:18323 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.772389 [INFO] agent: Stopping DNS server 127.0.0.1:18323 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.772557 [INFO] agent: Stopping HTTP server 127.0.0.1:18324 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.772767 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Randomize - 2019/12/30 18:55:56.772851 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Randomize (38.35s)
=== CONT  TestDNS_ServiceLookup_ServiceAddress_A
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:56.850047 [WARN] agent: Node name "Node bcdbbdc6-ce50-6df1-06c4-8469cef49998" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:56.850593 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:56.853280 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.892399 [INFO] manager: shutting down
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.893224 [INFO] agent: consul server down
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.893305 [INFO] agent: shutdown complete
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.893376 [INFO] agent: Stopping DNS server 127.0.0.1:18395 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.893615 [INFO] agent: Stopping DNS server 127.0.0.1:18395 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.893852 [INFO] agent: Stopping HTTP server 127.0.0.1:18396 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.894099 [INFO] agent: Waiting for endpoints to shut down
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.894184 [INFO] agent: Endpoints down
--- PASS: TestDNS_CaseInsensitiveServiceLookup (9.15s)
=== CONT  TestDNS_ExternalServiceToConsulCNAMENestedLookup
TestDNS_CaseInsensitiveServiceLookup - 2019/12/30 18:55:56.898022 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:56.972135 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:56.974888 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a959db6c-f0ee-b7b4-8753-17fcc3d22503 Address:127.0.0.1:18406}]
2019/12/30 18:55:57 [INFO]  raft: Node at 127.0.0.1:18406 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.114374 [INFO] serf: EventMemberJoin: Node a959db6c-f0ee-b7b4-8753-17fcc3d22503.dc1 127.0.0.1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.117704 [INFO] serf: EventMemberJoin: Node a959db6c-f0ee-b7b4-8753-17fcc3d22503 127.0.0.1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.118838 [INFO] consul: Adding LAN server Node a959db6c-f0ee-b7b4-8753-17fcc3d22503 (Addr: tcp/127.0.0.1:18406) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.119149 [INFO] consul: Handled member-join event for server "Node a959db6c-f0ee-b7b4-8753-17fcc3d22503.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.119309 [INFO] agent: Started DNS server 127.0.0.1:18401 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.119683 [INFO] agent: Started DNS server 127.0.0.1:18401 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.122120 [INFO] agent: Started HTTP server on 127.0.0.1:18402 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.122245 [INFO] agent: started state syncer
2019/12/30 18:55:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:57 [INFO]  raft: Node at 127.0.0.1:18406 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ec4c7197-2aee-c4b6-c1d8-77d0258eb70f Address:127.0.0.1:18412}]
2019/12/30 18:55:57 [INFO]  raft: Node at 127.0.0.1:18412 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.213186 [INFO] serf: EventMemberJoin: Node ec4c7197-2aee-c4b6-c1d8-77d0258eb70f.dc1 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.216473 [INFO] serf: EventMemberJoin: Node ec4c7197-2aee-c4b6-c1d8-77d0258eb70f 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.217292 [INFO] consul: Handled member-join event for server "Node ec4c7197-2aee-c4b6-c1d8-77d0258eb70f.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.217596 [INFO] consul: Adding LAN server Node ec4c7197-2aee-c4b6-c1d8-77d0258eb70f (Addr: tcp/127.0.0.1:18412) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.219368 [INFO] agent: Started DNS server 127.0.0.1:18407 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.219747 [INFO] agent: Started DNS server 127.0.0.1:18407 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.224988 [INFO] agent: Started HTTP server on 127.0.0.1:18408 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.225417 [INFO] agent: started state syncer
2019/12/30 18:55:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:57 [INFO]  raft: Node at 127.0.0.1:18412 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:57 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:57 [INFO]  raft: Node at 127.0.0.1:18406 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.748394 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:57.748794 [INFO] consul: New leader elected: Node a959db6c-f0ee-b7b4-8753-17fcc3d22503
2019/12/30 18:55:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bcdbbdc6-ce50-6df1-06c4-8469cef49998 Address:127.0.0.1:18418}]
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.939314 [INFO] serf: EventMemberJoin: Node bcdbbdc6-ce50-6df1-06c4-8469cef49998.dc1 127.0.0.1
2019/12/30 18:55:57 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:57 [INFO]  raft: Node at 127.0.0.1:18412 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.946859 [INFO] serf: EventMemberJoin: Node bcdbbdc6-ce50-6df1-06c4-8469cef49998 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.947995 [INFO] agent: Started DNS server 127.0.0.1:18413 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.948943 [INFO] consul: Adding LAN server Node bcdbbdc6-ce50-6df1-06c4-8469cef49998 (Addr: tcp/127.0.0.1:18418) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.949171 [INFO] consul: Handled member-join event for server "Node bcdbbdc6-ce50-6df1-06c4-8469cef49998.dc1" in area "wan"
2019/12/30 18:55:57 [INFO]  raft: Node at 127.0.0.1:18418 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.949836 [INFO] agent: Started DNS server 127.0.0.1:18413 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.950259 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:57.950586 [INFO] consul: New leader elected: Node ec4c7197-2aee-c4b6-c1d8-77d0258eb70f
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.952109 [INFO] agent: Started HTTP server on 127.0.0.1:18414 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:57.952201 [INFO] agent: started state syncer
2019/12/30 18:55:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:58 [INFO]  raft: Node at 127.0.0.1:18418 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3f292d26-3001-12d4-ee07-4cdb094ad577 Address:127.0.0.1:18424}]
2019/12/30 18:55:58 [INFO]  raft: Node at 127.0.0.1:18424 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.035657 [INFO] serf: EventMemberJoin: test-node.dc1 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.044881 [INFO] serf: EventMemberJoin: test-node 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.046640 [INFO] consul: Adding LAN server test-node (Addr: tcp/127.0.0.1:18424) (DC: dc1)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.047636 [INFO] consul: Handled member-join event for server "test-node.dc1" in area "wan"
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.049786 [INFO] agent: Started DNS server 127.0.0.1:18419 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.049903 [INFO] agent: Started DNS server 127.0.0.1:18419 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.053815 [INFO] agent: Started HTTP server on 127.0.0.1:18420 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.053901 [INFO] agent: started state syncer
2019/12/30 18:55:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:58 [INFO]  raft: Node at 127.0.0.1:18424 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:55:58.111346 [DEBUG] consul: Skipping self join check for "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d" since the cluster is too small
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:58.293217 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:55:58.293254 [INFO] agent: Synced node info
2019/12/30 18:55:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:58 [INFO]  raft: Node at 127.0.0.1:18418 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:58.722489 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:58.722890 [INFO] consul: New leader elected: Node bcdbbdc6-ce50-6df1-06c4-8469cef49998
2019/12/30 18:55:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:58 [INFO]  raft: Node at 127.0.0.1:18424 [Leader] entering Leader state
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.872710 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:58.873307 [INFO] consul: New leader elected: test-node
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.273041 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.273148 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:59.304246 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:55:59.304364 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.438514 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 703.018µs) from client 127.0.0.1:33935 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.445130 [DEBUG] dns: request for name bbad93c1-4509-387c-03f9-774efe3d5e9c.query.consul. type SRV class IN (took 960.359µs) from client 127.0.0.1:34255 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.446612 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.446695 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.446739 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.542627 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:55:59.577898 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.711479 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722244 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722316 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722451 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722516 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722569 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722671 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722732 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.722852 [INFO] agent: Stopping DNS server 127.0.0.1:18401 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.723018 [INFO] agent: Stopping DNS server 127.0.0.1:18401 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.723192 [INFO] agent: Stopping HTTP server 127.0.0.1:18402 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.723446 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/30 18:55:59.723515 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddressIPV6 (5.05s)
=== CONT  TestDNS_ReverseLookup_IPV6
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:55:59.831984 [WARN] agent: Node name "Node 58327229-af90-688d-7ea2-59ef6bb4caca" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:55:59.832586 [DEBUG] tlsutil: Update with version 1
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:55:59.834944 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.004101 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 759.02µs) from client 127.0.0.1:43431 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.005970 [DEBUG] dns: request for name 12c0a7a6-be58-790c-3259-7c3bc1f84c7e.query.consul. type SRV class IN (took 1.013694ms) from client 127.0.0.1:60639 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.008714 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.008807 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.008854 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.111243 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.200438 [DEBUG] agent: Node info in sync
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.200551 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.253467 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.254371 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.254516 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.254578 [INFO] agent: Stopping DNS server 127.0.0.1:18407 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.254726 [INFO] agent: Stopping DNS server 127.0.0.1:18407 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.254873 [INFO] agent: Stopping HTTP server 127.0.0.1:18408 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.255065 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.255130 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddress_CNAME (5.58s)
=== CONT  TestDNS_NSRecords_IPV6
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.256274 [ERR] connect: Apply failed raft is already shutdown
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/30 18:56:00.256332 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:00.319154 [DEBUG] tlsutil: Update with version 1
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:00.321348 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:56:00.444934 [DEBUG] consul: Skipping self join check for "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade" since the cluster is too small
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.749779 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.749859 [INFO] consul: shutting down server
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.749906 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.750526 [DEBUG] dns: request for name alias2.service.consul. type SRV class IN (took 1.2397ms) from client 127.0.0.1:55541 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.859123 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:00.870655 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 821.022µs) from client 127.0.0.1:34440 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:00.872385 [DEBUG] dns: request for name 075b3558-3b2c-fb53-7003-1e0ba1faa5ed.query.consul. type SRV class IN (took 980.692µs) from client 127.0.0.1:50564 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:00.872507 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:00.872607 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:00.872687 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:00.959028 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.959124 [INFO] manager: shutting down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.961153 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.961620 [INFO] agent: consul server down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.961667 [INFO] agent: shutdown complete
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.961718 [INFO] agent: Stopping DNS server 127.0.0.1:18419 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.961843 [INFO] agent: Stopping DNS server 127.0.0.1:18419 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.961995 [INFO] agent: Stopping HTTP server 127.0.0.1:18420 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.962194 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/30 18:56:00.962256 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceToConsulCNAMENestedLookup (4.07s)
=== CONT  TestDNS_ExternalServiceToConsulCNAMELookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:01.064128 [WARN] agent: Node name "test node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:01.064575 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:01.066609 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.084230 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.084965 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.085038 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.085110 [INFO] agent: Stopping DNS server 127.0.0.1:18413 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.085288 [INFO] agent: Stopping DNS server 127.0.0.1:18413 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.085494 [INFO] agent: Stopping HTTP server 127.0.0.1:18414 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.085754 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.085840 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddress_A (4.31s)
=== CONT  TestDNS_InifiniteRecursion
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/30 18:56:01.086545 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_InifiniteRecursion - 2019/12/30 18:56:01.149866 [WARN] agent: Node name "test node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_InifiniteRecursion - 2019/12/30 18:56:01.150261 [DEBUG] tlsutil: Update with version 1
TestDNS_InifiniteRecursion - 2019/12/30 18:56:01.152347 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:58327229-af90-688d-7ea2-59ef6bb4caca Address:127.0.0.1:18430}]
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.188071 [INFO] serf: EventMemberJoin: Node 58327229-af90-688d-7ea2-59ef6bb4caca.dc1 127.0.0.1
2019/12/30 18:56:01 [INFO]  raft: Node at 127.0.0.1:18430 [Follower] entering Follower state (Leader: "")
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.192079 [INFO] serf: EventMemberJoin: Node 58327229-af90-688d-7ea2-59ef6bb4caca 127.0.0.1
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.192789 [INFO] consul: Handled member-join event for server "Node 58327229-af90-688d-7ea2-59ef6bb4caca.dc1" in area "wan"
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.193112 [INFO] consul: Adding LAN server Node 58327229-af90-688d-7ea2-59ef6bb4caca (Addr: tcp/127.0.0.1:18430) (DC: dc1)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.193368 [INFO] agent: Started DNS server 127.0.0.1:18425 (udp)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.193691 [INFO] agent: Started DNS server 127.0.0.1:18425 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.196198 [INFO] agent: Started HTTP server on 127.0.0.1:18426 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:01.196904 [INFO] agent: started state syncer
2019/12/30 18:56:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:01 [INFO]  raft: Node at 127.0.0.1:18430 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4fc37478-45a0-7832-3d95-f62797ea8386 Address:[::1]:18436}]
2019/12/30 18:56:01 [INFO]  raft: Node at [::1]:18436 [Follower] entering Follower state (Leader: "")
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.531154 [INFO] serf: EventMemberJoin: server1.dc1 ::1
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.537754 [INFO] serf: EventMemberJoin: server1 ::1
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.538840 [INFO] agent: Started DNS server 127.0.0.1:18431 (udp)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.539211 [INFO] consul: Adding LAN server server1 (Addr: tcp/[::1]:18436) (DC: dc1)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.539438 [INFO] consul: Handled member-join event for server "server1.dc1" in area "wan"
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.539911 [INFO] agent: Started DNS server 127.0.0.1:18431 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.543587 [INFO] agent: Started HTTP server on 127.0.0.1:18432 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:01.543681 [INFO] agent: started state syncer
2019/12/30 18:56:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:01 [INFO]  raft: Node at [::1]:18436 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:02 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:02 [INFO]  raft: Node at 127.0.0.1:18430 [Leader] entering Leader state
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:02.057647 [INFO] consul: cluster leadership acquired
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:02.058029 [INFO] consul: New leader elected: Node 58327229-af90-688d-7ea2-59ef6bb4caca
2019/12/30 18:56:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d8e6e781-5557-8992-f20e-e17704ebe2cb Address:127.0.0.1:18442}]
2019/12/30 18:56:02 [INFO]  raft: Node at 127.0.0.1:18442 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.339268 [INFO] serf: EventMemberJoin: test node.dc1 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.347816 [INFO] serf: EventMemberJoin: test node 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.348749 [INFO] consul: Adding LAN server test node (Addr: tcp/127.0.0.1:18442) (DC: dc1)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.349156 [INFO] consul: Handled member-join event for server "test node.dc1" in area "wan"
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.350311 [INFO] agent: Started DNS server 127.0.0.1:18437 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.350389 [INFO] agent: Started DNS server 127.0.0.1:18437 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.352901 [INFO] agent: Started HTTP server on 127.0.0.1:18438 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:02.352992 [INFO] agent: started state syncer
2019/12/30 18:56:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:02 [INFO]  raft: Node at 127.0.0.1:18442 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:02 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:02 [INFO]  raft: Node at [::1]:18436 [Leader] entering Leader state
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:02.469609 [INFO] consul: cluster leadership acquired
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:02.470119 [INFO] consul: New leader elected: server1
2019/12/30 18:56:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:06f84531-d471-86a1-9336-946ae1c15196 Address:127.0.0.1:18448}]
2019/12/30 18:56:02 [INFO]  raft: Node at 127.0.0.1:18448 [Follower] entering Follower state (Leader: "")
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.478793 [INFO] serf: EventMemberJoin: test node.dc1 127.0.0.1
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.487336 [INFO] serf: EventMemberJoin: test node 127.0.0.1
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.488218 [INFO] consul: Adding LAN server test node (Addr: tcp/127.0.0.1:18448) (DC: dc1)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.488754 [INFO] consul: Handled member-join event for server "test node.dc1" in area "wan"
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.489841 [INFO] agent: Started DNS server 127.0.0.1:18443 (tcp)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.489941 [INFO] agent: Started DNS server 127.0.0.1:18443 (udp)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.492304 [INFO] agent: Started HTTP server on 127.0.0.1:18444 (tcp)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:02.492401 [INFO] agent: started state syncer
2019/12/30 18:56:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:02 [INFO]  raft: Node at 127.0.0.1:18448 [Candidate] entering Candidate state in term 2
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:02.693265 [INFO] agent: Synced node info
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:02.693399 [DEBUG] agent: Node info in sync
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:03.194196 [INFO] agent: Synced node info
2019/12/30 18:56:03 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:03 [INFO]  raft: Node at 127.0.0.1:18442 [Leader] entering Leader state
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:03.304851 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:03.305393 [INFO] consul: New leader elected: test node
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.307314 [DEBUG] dns: request for {2.4.2.4.2.4.2.4.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa. 255 1} (727.353µs) from client 127.0.0.1:33116 (udp)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.307438 [INFO] agent: Requesting shutdown
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.307517 [INFO] consul: shutting down server
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.307570 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:03 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:03 [INFO]  raft: Node at 127.0.0.1:18448 [Leader] entering Leader state
TestDNS_InifiniteRecursion - 2019/12/30 18:56:03.476528 [INFO] consul: cluster leadership acquired
TestDNS_InifiniteRecursion - 2019/12/30 18:56:03.476936 [INFO] consul: New leader elected: test node
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.477293 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.569227 [INFO] manager: shutting down
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.661184 [INFO] agent: consul server down
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.661274 [INFO] agent: shutdown complete
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.661337 [INFO] agent: Stopping DNS server 127.0.0.1:18425 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.661522 [INFO] agent: Stopping DNS server 127.0.0.1:18425 (udp)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.661707 [INFO] agent: Stopping HTTP server 127.0.0.1:18426 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.661980 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.662058 [INFO] agent: Endpoints down
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.662499 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.662708 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.662769 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.662822 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ReverseLookup_IPV6 - 2019/12/30 18:56:03.662887 [ERR] consul: failed to transfer leadership in 3 attempts
--- PASS: TestDNS_ReverseLookup_IPV6 (3.94s)
=== CONT  TestDNS_ExternalServiceLookup
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:03.735360 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:03.783637 [WARN] agent: Node name "Node 2d805706-6c9e-f2dc-18eb-ffb2f0b75fbf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:03.784156 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:03.786481 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_InifiniteRecursion - 2019/12/30 18:56:03.868389 [INFO] agent: Synced node info
TestDNS_InifiniteRecursion - 2019/12/30 18:56:03.868518 [DEBUG] agent: Node info in sync
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.327065 [ERR] dns: Infinite recursion detected for web.service.consul., won't perform any CNAME resolution.
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.327439 [DEBUG] dns: request for name web.service.consul. type A class IN (took 1.620709ms) from client 127.0.0.1:58967 (udp)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.327556 [INFO] agent: Requesting shutdown
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.327636 [INFO] consul: shutting down server
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.327692 [WARN] serf: Shutdown without a Leave
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.409849 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.410277 [DEBUG] consul: Skipping self join check for "server1" since the cluster is too small
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.410439 [INFO] consul: member 'server1' joined, marking health alive
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.517489 [WARN] serf: Shutdown without a Leave
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.617539 [INFO] manager: shutting down
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.626802 [DEBUG] dns: request for name server1.node.dc1.consul. type NS class IN (took 862.023µs) from client 127.0.0.1:58953 (udp)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.627058 [INFO] agent: Requesting shutdown
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.627145 [INFO] consul: shutting down server
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.627194 [WARN] serf: Shutdown without a Leave
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.717507 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.717786 [INFO] agent: consul server down
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.717840 [INFO] agent: shutdown complete
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.717898 [INFO] agent: Stopping DNS server 127.0.0.1:18443 (tcp)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.718047 [INFO] agent: Stopping DNS server 127.0.0.1:18443 (udp)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.718210 [INFO] agent: Stopping HTTP server 127.0.0.1:18444 (tcp)
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.718444 [INFO] agent: Waiting for endpoints to shut down
TestDNS_InifiniteRecursion - 2019/12/30 18:56:04.718512 [INFO] agent: Endpoints down
--- PASS: TestDNS_InifiniteRecursion (3.63s)
=== CONT  TestDNS_ConnectServiceLookup
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.785523 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:04.793871 [WARN] agent: Node name "Node 67578034-b646-81a0-05c5-4b1ca032b5b1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:04.794428 [DEBUG] tlsutil: Update with version 1
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:04.797232 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.909989 [INFO] manager: shutting down
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.910632 [INFO] agent: consul server down
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.910725 [INFO] agent: shutdown complete
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.910803 [INFO] agent: Stopping DNS server 127.0.0.1:18431 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.911040 [INFO] agent: Stopping DNS server 127.0.0.1:18431 (udp)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.911284 [INFO] agent: Stopping HTTP server 127.0.0.1:18432 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.911556 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NSRecords_IPV6 - 2019/12/30 18:56:04.911651 [INFO] agent: Endpoints down
--- PASS: TestDNS_NSRecords_IPV6 (4.66s)
=== CONT  TestDNS_ServiceLookupWithInternalServiceAddress
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:04.961323 [DEBUG] agent: Node info in sync
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:04.961440 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:04.998090 [WARN] agent: Node name "my.test-node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:04.998725 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:05.001381 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2d805706-6c9e-f2dc-18eb-ffb2f0b75fbf Address:127.0.0.1:18454}]
2019/12/30 18:56:05 [INFO]  raft: Node at 127.0.0.1:18454 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.185421 [INFO] serf: EventMemberJoin: Node 2d805706-6c9e-f2dc-18eb-ffb2f0b75fbf.dc1 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.188452 [DEBUG] dns: request for name alias.service.consul. type SRV class IN (took 2.938411ms) from client 127.0.0.1:59877 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.191737 [DEBUG] dns: request for name alias.service.CoNsUl. type SRV class IN (took 1.149364ms) from client 127.0.0.1:59570 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.192168 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.192239 [INFO] consul: shutting down server
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.192338 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.194705 [INFO] serf: EventMemberJoin: Node 2d805706-6c9e-f2dc-18eb-ffb2f0b75fbf 127.0.0.1
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.195639 [INFO] consul: Adding LAN server Node 2d805706-6c9e-f2dc-18eb-ffb2f0b75fbf (Addr: tcp/127.0.0.1:18454) (DC: dc1)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.196206 [INFO] consul: Handled member-join event for server "Node 2d805706-6c9e-f2dc-18eb-ffb2f0b75fbf.dc1" in area "wan"
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.197274 [INFO] agent: Started DNS server 127.0.0.1:18449 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.197617 [INFO] agent: Started DNS server 127.0.0.1:18449 (udp)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.200062 [INFO] agent: Started HTTP server on 127.0.0.1:18450 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.200309 [INFO] agent: started state syncer
2019/12/30 18:56:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:05 [INFO]  raft: Node at 127.0.0.1:18454 [Candidate] entering Candidate state in term 2
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.342504 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.425948 [INFO] manager: shutting down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.506187 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.506274 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.506469 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.506473 [INFO] agent: consul server down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.506588 [INFO] agent: shutdown complete
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.506651 [INFO] agent: Stopping DNS server 127.0.0.1:18437 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.506814 [INFO] agent: Stopping DNS server 127.0.0.1:18437 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.507023 [INFO] agent: Stopping HTTP server 127.0.0.1:18438 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.507281 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/30 18:56:05.507366 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceToConsulCNAMELookup (4.54s)
=== CONT  TestDNS_ServiceLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup - 2019/12/30 18:56:05.565799 [WARN] agent: Node name "Node 3a00821b-3c98-ac3d-be40-a1ca7527603c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup - 2019/12/30 18:56:05.566217 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup - 2019/12/30 18:56:05.568349 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:05 [INFO]  raft: Node at 127.0.0.1:18454 [Leader] entering Leader state
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.766001 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:05.768457 [INFO] consul: New leader elected: Node 2d805706-6c9e-f2dc-18eb-ffb2f0b75fbf
2019/12/30 18:56:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:67578034-b646-81a0-05c5-4b1ca032b5b1 Address:127.0.0.1:18460}]
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.863738 [INFO] serf: EventMemberJoin: Node 67578034-b646-81a0-05c5-4b1ca032b5b1.dc1 127.0.0.1
2019/12/30 18:56:05 [INFO]  raft: Node at 127.0.0.1:18460 [Follower] entering Follower state (Leader: "")
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.870037 [INFO] serf: EventMemberJoin: Node 67578034-b646-81a0-05c5-4b1ca032b5b1 127.0.0.1
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.870795 [INFO] consul: Adding LAN server Node 67578034-b646-81a0-05c5-4b1ca032b5b1 (Addr: tcp/127.0.0.1:18460) (DC: dc1)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.870944 [INFO] consul: Handled member-join event for server "Node 67578034-b646-81a0-05c5-4b1ca032b5b1.dc1" in area "wan"
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.872248 [INFO] agent: Started DNS server 127.0.0.1:18455 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.872332 [INFO] agent: Started DNS server 127.0.0.1:18455 (udp)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.874818 [INFO] agent: Started HTTP server on 127.0.0.1:18456 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:05.874932 [INFO] agent: started state syncer
2019/12/30 18:56:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:05 [INFO]  raft: Node at 127.0.0.1:18460 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9d313723-df4a-886a-dc70-0896e972598d Address:127.0.0.1:18466}]
2019/12/30 18:56:06 [INFO]  raft: Node at 127.0.0.1:18466 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.174714 [INFO] agent: Synced node info
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.176048 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.177551 [INFO] serf: EventMemberJoin: my.test-node.dc1 127.0.0.1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.182068 [INFO] serf: EventMemberJoin: my.test-node 127.0.0.1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.182590 [INFO] consul: Handled member-join event for server "my.test-node.dc1" in area "wan"
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.182753 [INFO] consul: Adding LAN server my.test-node (Addr: tcp/127.0.0.1:18466) (DC: dc1)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.183242 [INFO] agent: Started DNS server 127.0.0.1:18461 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.183306 [INFO] agent: Started DNS server 127.0.0.1:18461 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.185827 [INFO] agent: Started HTTP server on 127.0.0.1:18462 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.186160 [INFO] agent: started state syncer
2019/12/30 18:56:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:06 [INFO]  raft: Node at 127.0.0.1:18466 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:56:06.247969 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:56:06.248052 [DEBUG] agent: Node info in sync
2019/12/30 18:56:06 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:06 [INFO]  raft: Node at 127.0.0.1:18460 [Leader] entering Leader state
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:06.461497 [INFO] consul: cluster leadership acquired
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:06.462138 [INFO] consul: New leader elected: Node 67578034-b646-81a0-05c5-4b1ca032b5b1
2019/12/30 18:56:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3a00821b-3c98-ac3d-be40-a1ca7527603c Address:127.0.0.1:18472}]
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.545784 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 915.024µs) from client 127.0.0.1:59271 (udp)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.546063 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.546143 [INFO] consul: shutting down server
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.546191 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup - 2019/12/30 18:56:06.547265 [INFO] serf: EventMemberJoin: Node 3a00821b-3c98-ac3d-be40-a1ca7527603c.dc1 127.0.0.1
2019/12/30 18:56:06 [INFO]  raft: Node at 127.0.0.1:18472 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup - 2019/12/30 18:56:06.557857 [INFO] serf: EventMemberJoin: Node 3a00821b-3c98-ac3d-be40-a1ca7527603c 127.0.0.1
TestDNS_ServiceLookup - 2019/12/30 18:56:06.559112 [INFO] consul: Adding LAN server Node 3a00821b-3c98-ac3d-be40-a1ca7527603c (Addr: tcp/127.0.0.1:18472) (DC: dc1)
TestDNS_ServiceLookup - 2019/12/30 18:56:06.559226 [INFO] consul: Handled member-join event for server "Node 3a00821b-3c98-ac3d-be40-a1ca7527603c.dc1" in area "wan"
TestDNS_ServiceLookup - 2019/12/30 18:56:06.560346 [INFO] agent: Started DNS server 127.0.0.1:18467 (tcp)
TestDNS_ServiceLookup - 2019/12/30 18:56:06.560429 [INFO] agent: Started DNS server 127.0.0.1:18467 (udp)
TestDNS_ServiceLookup - 2019/12/30 18:56:06.562738 [INFO] agent: Started HTTP server on 127.0.0.1:18468 (tcp)
TestDNS_ServiceLookup - 2019/12/30 18:56:06.562839 [INFO] agent: started state syncer
2019/12/30 18:56:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:06 [INFO]  raft: Node at 127.0.0.1:18472 [Candidate] entering Candidate state in term 2
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.684157 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.759255 [INFO] manager: shutting down
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.851200 [INFO] agent: consul server down
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.851287 [INFO] agent: shutdown complete
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.851361 [INFO] agent: Stopping DNS server 127.0.0.1:18449 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.851549 [INFO] agent: Stopping DNS server 127.0.0.1:18449 (udp)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.851739 [INFO] agent: Stopping HTTP server 127.0.0.1:18450 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.851982 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.852074 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceLookup (3.19s)
=== CONT  TestDNS_ServiceLookupMultiAddrNoCNAME
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:06.855540 [INFO] agent: Synced node info
2019/12/30 18:56:06 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:06 [INFO]  raft: Node at 127.0.0.1:18466 [Leader] entering Leader state
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.867044 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ExternalServiceLookup - 2019/12/30 18:56:06.867363 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.867514 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:06.867827 [INFO] consul: New leader elected: my.test-node
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:06.920684 [WARN] agent: Node name "Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:06.921140 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:06.923127 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:07 [INFO]  raft: Node at 127.0.0.1:18472 [Leader] entering Leader state
TestDNS_ServiceLookup - 2019/12/30 18:56:07.130420 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup - 2019/12/30 18:56:07.130847 [INFO] consul: New leader elected: Node 3a00821b-3c98-ac3d-be40-a1ca7527603c
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.334127 [DEBUG] dns: request for name db.connect.consul. type SRV class IN (took 899.357µs) from client 127.0.0.1:48720 (udp)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.334789 [INFO] agent: Requesting shutdown
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.334859 [INFO] consul: shutting down server
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.334900 [WARN] serf: Shutdown without a Leave
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.404254 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup - 2019/12/30 18:56:07.518406 [INFO] agent: Synced node info
TestDNS_ServiceLookup - 2019/12/30 18:56:07.518551 [DEBUG] agent: Node info in sync
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.550631 [DEBUG] agent: Node info in sync
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.550758 [DEBUG] agent: Node info in sync
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:07.675936 [INFO] manager: shutting down
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:07.679690 [INFO] agent: Synced node info
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:07.679811 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:07.779223 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup - 2019/12/30 18:56:07.886579 [DEBUG] agent: Node info in sync
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.693012 [INFO] agent: consul server down
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.693109 [INFO] agent: shutdown complete
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.693189 [INFO] agent: Stopping DNS server 127.0.0.1:18455 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.693387 [INFO] agent: Stopping DNS server 127.0.0.1:18455 (udp)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.693590 [INFO] agent: Stopping HTTP server 127.0.0.1:18456 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.693913 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.694005 [INFO] agent: Endpoints down
--- PASS: TestDNS_ConnectServiceLookup (3.98s)
=== CONT  TestDNS_ServiceLookupPreferNoCNAME
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.694253 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.694542 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ConnectServiceLookup - 2019/12/30 18:56:08.694644 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:08.807003 [WARN] agent: Node name "Node b1b7a894-e19a-6947-6f6a-9fdd627a8ca4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:08.807425 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:08.810042 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.071887 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.255366ms) from client 127.0.0.1:45383 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.073313 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.073400 [INFO] consul: shutting down server
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.073466 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.152863 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3d416b43-648f-2731-83f3-9d4c5afdb4b2 Address:127.0.0.1:18478}]
2019/12/30 18:56:09 [INFO]  raft: Node at 127.0.0.1:18478 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.163682 [INFO] serf: EventMemberJoin: Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2.dc1 127.0.0.1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.170952 [INFO] serf: EventMemberJoin: Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2 127.0.0.1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.175149 [INFO] consul: Adding LAN server Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2 (Addr: tcp/127.0.0.1:18478) (DC: dc1)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.175454 [INFO] consul: Handled member-join event for server "Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2.dc1" in area "wan"
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.175985 [INFO] agent: Started DNS server 127.0.0.1:18473 (udp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.176076 [INFO] agent: Started DNS server 127.0.0.1:18473 (tcp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.188738 [INFO] agent: Started HTTP server on 127.0.0.1:18474 (tcp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:09.188857 [INFO] agent: started state syncer
2019/12/30 18:56:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:09 [INFO]  raft: Node at 127.0.0.1:18478 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.292674 [INFO] manager: shutting down
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.526373 [INFO] agent: consul server down
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.526474 [INFO] agent: shutdown complete
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.526550 [INFO] agent: Stopping DNS server 127.0.0.1:18461 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.526731 [INFO] agent: Stopping DNS server 127.0.0.1:18461 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.526962 [INFO] agent: Stopping HTTP server 127.0.0.1:18462 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.527242 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.527351 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookupWithInternalServiceAddress (4.62s)
=== CONT  TestDNS_ServiceReverseLookupNodeAddress
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/30 18:56:09.528001 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup - 2019/12/30 18:56:09.530881 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 929.024µs) from client 127.0.0.1:58951 (udp)
TestDNS_ServiceLookup - 2019/12/30 18:56:09.542914 [DEBUG] dns: request for name 3bac1178-f8de-09ae-79bb-bbbdf3b4eb95.query.consul. type SRV class IN (took 10.832619ms) from client 127.0.0.1:34956 (udp)
TestDNS_ServiceLookup - 2019/12/30 18:56:09.543341 [DEBUG] dns: request for name nodb.service.consul. type SRV class IN (took 1.349035ms) from client 127.0.0.1:35285 (udp)
TestDNS_ServiceLookup - 2019/12/30 18:56:09.544486 [DEBUG] dns: request for name nope.query.consul. type SRV class IN (took 469.679µs) from client 127.0.0.1:48028 (udp)
TestDNS_ServiceLookup - 2019/12/30 18:56:09.544589 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup - 2019/12/30 18:56:09.544664 [INFO] consul: shutting down server
TestDNS_ServiceLookup - 2019/12/30 18:56:09.544721 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:09.631737 [WARN] agent: Node name "Node 517178b3-5dc0-dee8-9a1e-15b2163b1360" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:09.632329 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:09.634822 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup - 2019/12/30 18:56:09.810478 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup - 2019/12/30 18:56:09.935689 [INFO] manager: shutting down
2019/12/30 18:56:10 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:10 [INFO]  raft: Node at 127.0.0.1:18478 [Leader] entering Leader state
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:10.126431 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:10.126899 [INFO] consul: New leader elected: Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2
TestDNS_ServiceLookup - 2019/12/30 18:56:10.127314 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_ServiceLookup - 2019/12/30 18:56:10.127853 [INFO] agent: consul server down
TestDNS_ServiceLookup - 2019/12/30 18:56:10.128033 [INFO] agent: shutdown complete
TestDNS_ServiceLookup - 2019/12/30 18:56:10.128162 [INFO] agent: Stopping DNS server 127.0.0.1:18467 (tcp)
TestDNS_ServiceLookup - 2019/12/30 18:56:10.128485 [INFO] agent: Stopping DNS server 127.0.0.1:18467 (udp)
TestDNS_ServiceLookup - 2019/12/30 18:56:10.128845 [INFO] agent: Stopping HTTP server 127.0.0.1:18468 (tcp)
TestDNS_ServiceLookup - 2019/12/30 18:56:10.129238 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup - 2019/12/30 18:56:10.129484 [INFO] agent: Endpoints down
=== CONT  TestDNS_SOA_Settings
--- PASS: TestDNS_ServiceLookup (4.62s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/30 18:56:10.186439 [WARN] agent: Node name "Node 21bd93dc-f5c8-ea77-367f-05a6873a01d6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/30 18:56:10.186843 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/30 18:56:10.189098 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b1b7a894-e19a-6947-6f6a-9fdd627a8ca4 Address:127.0.0.1:18484}]
2019/12/30 18:56:10 [INFO]  raft: Node at 127.0.0.1:18484 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.398715 [INFO] serf: EventMemberJoin: Node b1b7a894-e19a-6947-6f6a-9fdd627a8ca4.dc1 127.0.0.1
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.407887 [INFO] serf: EventMemberJoin: Node b1b7a894-e19a-6947-6f6a-9fdd627a8ca4 127.0.0.1
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.409376 [INFO] consul: Adding LAN server Node b1b7a894-e19a-6947-6f6a-9fdd627a8ca4 (Addr: tcp/127.0.0.1:18484) (DC: dc1)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.410179 [INFO] consul: Handled member-join event for server "Node b1b7a894-e19a-6947-6f6a-9fdd627a8ca4.dc1" in area "wan"
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.411968 [INFO] agent: Started DNS server 127.0.0.1:18479 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.412467 [INFO] agent: Started DNS server 127.0.0.1:18479 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.414846 [INFO] agent: Started HTTP server on 127.0.0.1:18480 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:10.414951 [INFO] agent: started state syncer
2019/12/30 18:56:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:10 [INFO]  raft: Node at 127.0.0.1:18484 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:10.896847 [INFO] agent: Synced node info
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:11.343129 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:11.343237 [DEBUG] agent: Node info in sync
2019/12/30 18:56:11 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:11 [INFO]  raft: Node at 127.0.0.1:18484 [Leader] entering Leader state
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:11.376689 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:11.377188 [INFO] consul: New leader elected: Node b1b7a894-e19a-6947-6f6a-9fdd627a8ca4
2019/12/30 18:56:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:517178b3-5dc0-dee8-9a1e-15b2163b1360 Address:127.0.0.1:18490}]
2019/12/30 18:56:11 [INFO]  raft: Node at 127.0.0.1:18490 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.382005 [INFO] serf: EventMemberJoin: Node 517178b3-5dc0-dee8-9a1e-15b2163b1360.dc1 127.0.0.1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.399603 [INFO] serf: EventMemberJoin: Node 517178b3-5dc0-dee8-9a1e-15b2163b1360 127.0.0.1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.400425 [INFO] consul: Handled member-join event for server "Node 517178b3-5dc0-dee8-9a1e-15b2163b1360.dc1" in area "wan"
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.400770 [INFO] consul: Adding LAN server Node 517178b3-5dc0-dee8-9a1e-15b2163b1360 (Addr: tcp/127.0.0.1:18490) (DC: dc1)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.401026 [INFO] agent: Started DNS server 127.0.0.1:18485 (udp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.401489 [INFO] agent: Started DNS server 127.0.0.1:18485 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.403818 [INFO] agent: Started HTTP server on 127.0.0.1:18486 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:11.403921 [INFO] agent: started state syncer
2019/12/30 18:56:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:11 [INFO]  raft: Node at 127.0.0.1:18490 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:11.860546 [INFO] agent: Synced node info
2019/12/30 18:56:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:21bd93dc-f5c8-ea77-367f-05a6873a01d6 Address:127.0.0.1:18496}]
TestDNS_SOA_Settings - 2019/12/30 18:56:11.866565 [INFO] serf: EventMemberJoin: Node 21bd93dc-f5c8-ea77-367f-05a6873a01d6.dc1 127.0.0.1
2019/12/30 18:56:11 [INFO]  raft: Node at 127.0.0.1:18496 [Follower] entering Follower state (Leader: "")
TestDNS_SOA_Settings - 2019/12/30 18:56:11.873586 [INFO] serf: EventMemberJoin: Node 21bd93dc-f5c8-ea77-367f-05a6873a01d6 127.0.0.1
TestDNS_SOA_Settings - 2019/12/30 18:56:11.874890 [INFO] consul: Adding LAN server Node 21bd93dc-f5c8-ea77-367f-05a6873a01d6 (Addr: tcp/127.0.0.1:18496) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/30 18:56:11.875398 [INFO] consul: Handled member-join event for server "Node 21bd93dc-f5c8-ea77-367f-05a6873a01d6.dc1" in area "wan"
TestDNS_SOA_Settings - 2019/12/30 18:56:11.878049 [INFO] agent: Started DNS server 127.0.0.1:18491 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:11.878552 [INFO] agent: Started DNS server 127.0.0.1:18491 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:11.887999 [INFO] agent: Started HTTP server on 127.0.0.1:18492 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:11.888093 [INFO] agent: started state syncer
2019/12/30 18:56:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:11 [INFO]  raft: Node at 127.0.0.1:18496 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:12 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:12 [INFO]  raft: Node at 127.0.0.1:18490 [Leader] entering Leader state
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:12.301775 [INFO] consul: cluster leadership acquired
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:12.302202 [INFO] consul: New leader elected: Node 517178b3-5dc0-dee8-9a1e-15b2163b1360
jones - 2019/12/30 18:56:12.383720 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:56:12.383811 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/30 18:56:12.383845 [DEBUG] agent: Node info in sync
2019/12/30 18:56:12 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:12 [INFO]  raft: Node at 127.0.0.1:18496 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/30 18:56:12.752344 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/30 18:56:12.752801 [INFO] consul: New leader elected: Node 21bd93dc-f5c8-ea77-367f-05a6873a01d6
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:12.927970 [INFO] agent: Synced node info
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:12.928096 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.029910 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 1.371369ms) from client 127.0.0.1:59162 (udp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.031257 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.031357 [INFO] consul: shutting down server
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.031409 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.154130 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 1.132696ms) from client 127.0.0.1:51603 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.154643 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.154743 [INFO] consul: shutting down server
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.154791 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.259363 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.260434 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.376048 [INFO] manager: shutting down
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.376453 [INFO] agent: consul server down
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.376506 [INFO] agent: shutdown complete
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.376565 [INFO] agent: Stopping DNS server 127.0.0.1:18479 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.376721 [INFO] agent: Stopping DNS server 127.0.0.1:18479 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.376889 [INFO] agent: Stopping HTTP server 127.0.0.1:18480 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.377109 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:13.377184 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookupPreferNoCNAME (4.68s)
=== CONT  TestDNS_ServiceReverseLookup_CustomDomain
TestDNS_SOA_Settings - 2019/12/30 18:56:13.377357 [INFO] agent: Synced node info
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.379933 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.381779 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.382236 [DEBUG] consul: Skipping self join check for "Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2" since the cluster is too small
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.382396 [INFO] consul: member 'Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2' joined, marking health alive
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.384111 [INFO] manager: shutting down
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.387366 [WARN] consul: error getting server health from "Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2": rpc error making call: EOF
TestDNS_SOA_Settings - 2019/12/30 18:56:13.391212 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 579.349µs) from client 127.0.0.1:45158 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:13.391800 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/30 18:56:13.391916 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/30 18:56:13.391966 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/30 18:56:13.494674 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:13.520113 [WARN] agent: Node name "Node df0e3ae5-06a2-fb09-20de-7f308779d9f9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:13.525676 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:13.528391 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.601279 [INFO] agent: consul server down
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.601356 [INFO] agent: shutdown complete
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.601426 [INFO] agent: Stopping DNS server 127.0.0.1:18473 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:13.601554 [INFO] manager: shutting down
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.601600 [INFO] agent: Stopping DNS server 127.0.0.1:18473 (udp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.601820 [INFO] agent: Stopping HTTP server 127.0.0.1:18474 (tcp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.601997 [ERR] consul: failed to reconcile member: {Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2 127.0.0.1 18476 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:3d416b43-648f-2731-83f3-9d4c5afdb4b2 port:18478 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:18477] alive 1 5 2 2 5 4}: leadership lost while committing log
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.602155 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:13.602265 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookupMultiAddrNoCNAME (6.75s)
=== CONT  TestDNS_ServiceReverseLookup_IPV6
TestDNS_SOA_Settings - 2019/12/30 18:56:13.604603 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_SOA_Settings - 2019/12/30 18:56:13.604848 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/30 18:56:13.604898 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/30 18:56:13.604934 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/30 18:56:13.604945 [INFO] agent: Stopping DNS server 127.0.0.1:18491 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:13.605147 [INFO] agent: Stopping DNS server 127.0.0.1:18491 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:13.605292 [INFO] agent: Stopping HTTP server 127.0.0.1:18492 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:13.605493 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/30 18:56:13.605561 [INFO] agent: Endpoints down
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:13.774775 [WARN] agent: Node name "Node f68adf87-9c70-265f-8e45-f632b04e86a9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:13.777224 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:13.783245 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/30 18:56:13.786870 [WARN] agent: Node name "Node 2e2852ff-8ac1-2139-418d-1c28377ad211" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/30 18:56:13.790528 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/30 18:56:13.793326 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:13.912083 [DEBUG] dns: request for {1.0.0.127.in-addr.arpa. 255 1} (527.347µs) from client 127.0.0.1:38281 (udp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:13.912384 [INFO] agent: Requesting shutdown
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:13.912453 [INFO] consul: shutting down server
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:13.912501 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:13.993882 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.087646 [INFO] manager: shutting down
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.088123 [INFO] agent: consul server down
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.088191 [INFO] agent: shutdown complete
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.088247 [INFO] agent: Stopping DNS server 127.0.0.1:18485 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.088395 [INFO] agent: Stopping DNS server 127.0.0.1:18485 (udp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.088550 [INFO] agent: Stopping HTTP server 127.0.0.1:18486 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.088760 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.088831 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceReverseLookupNodeAddress (4.56s)
=== CONT  TestDNS_ReverseLookup_CustomDomain
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.103460 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/30 18:56:14.103704 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:14.159168 [WARN] agent: Node name "Node 9fd7f14b-2206-4a08-931f-f3d220c7ee15" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:14.159622 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:14.162447 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:14.163282 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/30 18:56:14.380027 [WARN] consul: error getting server health from "Node 3d416b43-648f-2731-83f3-9d4c5afdb4b2": context deadline exceeded
2019/12/30 18:56:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:df0e3ae5-06a2-fb09-20de-7f308779d9f9 Address:127.0.0.1:18502}]
2019/12/30 18:56:14 [INFO]  raft: Node at 127.0.0.1:18502 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.872993 [INFO] serf: EventMemberJoin: Node df0e3ae5-06a2-fb09-20de-7f308779d9f9.dc1 127.0.0.1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.876897 [INFO] serf: EventMemberJoin: Node df0e3ae5-06a2-fb09-20de-7f308779d9f9 127.0.0.1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.878004 [INFO] consul: Adding LAN server Node df0e3ae5-06a2-fb09-20de-7f308779d9f9 (Addr: tcp/127.0.0.1:18502) (DC: dc1)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.878592 [INFO] consul: Handled member-join event for server "Node df0e3ae5-06a2-fb09-20de-7f308779d9f9.dc1" in area "wan"
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.879834 [INFO] agent: Started DNS server 127.0.0.1:18497 (tcp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.880226 [INFO] agent: Started DNS server 127.0.0.1:18497 (udp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.882661 [INFO] agent: Started HTTP server on 127.0.0.1:18498 (tcp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:14.882935 [INFO] agent: started state syncer
2019/12/30 18:56:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:14 [INFO]  raft: Node at 127.0.0.1:18502 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2e2852ff-8ac1-2139-418d-1c28377ad211 Address:127.0.0.1:18514}]
2019/12/30 18:56:14 [INFO]  raft: Node at 127.0.0.1:18514 [Follower] entering Follower state (Leader: "")
TestDNS_SOA_Settings - 2019/12/30 18:56:14.989609 [INFO] serf: EventMemberJoin: Node 2e2852ff-8ac1-2139-418d-1c28377ad211.dc1 127.0.0.1
TestDNS_SOA_Settings - 2019/12/30 18:56:14.993144 [INFO] serf: EventMemberJoin: Node 2e2852ff-8ac1-2139-418d-1c28377ad211 127.0.0.1
TestDNS_SOA_Settings - 2019/12/30 18:56:14.993728 [INFO] consul: Adding LAN server Node 2e2852ff-8ac1-2139-418d-1c28377ad211 (Addr: tcp/127.0.0.1:18514) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/30 18:56:14.993984 [INFO] consul: Handled member-join event for server "Node 2e2852ff-8ac1-2139-418d-1c28377ad211.dc1" in area "wan"
TestDNS_SOA_Settings - 2019/12/30 18:56:14.994259 [INFO] agent: Started DNS server 127.0.0.1:18509 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:14.994352 [INFO] agent: Started DNS server 127.0.0.1:18509 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:14.997229 [INFO] agent: Started HTTP server on 127.0.0.1:18510 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:14.997354 [INFO] agent: started state syncer
2019/12/30 18:56:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:15 [INFO]  raft: Node at 127.0.0.1:18514 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f68adf87-9c70-265f-8e45-f632b04e86a9 Address:127.0.0.1:18508}]
2019/12/30 18:56:15 [INFO]  raft: Node at 127.0.0.1:18508 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.170521 [INFO] serf: EventMemberJoin: Node f68adf87-9c70-265f-8e45-f632b04e86a9.dc1 127.0.0.1
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.179879 [INFO] serf: EventMemberJoin: Node f68adf87-9c70-265f-8e45-f632b04e86a9 127.0.0.1
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.181585 [INFO] consul: Adding LAN server Node f68adf87-9c70-265f-8e45-f632b04e86a9 (Addr: tcp/127.0.0.1:18508) (DC: dc1)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.182547 [INFO] consul: Handled member-join event for server "Node f68adf87-9c70-265f-8e45-f632b04e86a9.dc1" in area "wan"
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.184501 [INFO] agent: Started DNS server 127.0.0.1:18503 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.184922 [INFO] agent: Started DNS server 127.0.0.1:18503 (udp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.188200 [INFO] agent: Started HTTP server on 127.0.0.1:18504 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:15.188309 [INFO] agent: started state syncer
2019/12/30 18:56:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:15 [INFO]  raft: Node at 127.0.0.1:18508 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9fd7f14b-2206-4a08-931f-f3d220c7ee15 Address:127.0.0.1:18520}]
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.432928 [INFO] serf: EventMemberJoin: Node 9fd7f14b-2206-4a08-931f-f3d220c7ee15.dc1 127.0.0.1
2019/12/30 18:56:15 [INFO]  raft: Node at 127.0.0.1:18520 [Follower] entering Follower state (Leader: "")
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.440106 [INFO] serf: EventMemberJoin: Node 9fd7f14b-2206-4a08-931f-f3d220c7ee15 127.0.0.1
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.442171 [INFO] consul: Adding LAN server Node 9fd7f14b-2206-4a08-931f-f3d220c7ee15 (Addr: tcp/127.0.0.1:18520) (DC: dc1)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.443140 [INFO] consul: Handled member-join event for server "Node 9fd7f14b-2206-4a08-931f-f3d220c7ee15.dc1" in area "wan"
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.444814 [INFO] agent: Started DNS server 127.0.0.1:18515 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.445624 [INFO] agent: Started DNS server 127.0.0.1:18515 (udp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.448292 [INFO] agent: Started HTTP server on 127.0.0.1:18516 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:15.448416 [INFO] agent: started state syncer
2019/12/30 18:56:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:15 [INFO]  raft: Node at 127.0.0.1:18520 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:15 [INFO]  raft: Node at 127.0.0.1:18502 [Leader] entering Leader state
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:15.735126 [INFO] consul: cluster leadership acquired
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:15.735822 [INFO] consul: New leader elected: Node df0e3ae5-06a2-fb09-20de-7f308779d9f9
2019/12/30 18:56:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:15 [INFO]  raft: Node at 127.0.0.1:18514 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/30 18:56:16.003108 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/30 18:56:16.003552 [INFO] consul: New leader elected: Node 2e2852ff-8ac1-2139-418d-1c28377ad211
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/30 18:56:16.162401 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
2019/12/30 18:56:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:16 [INFO]  raft: Node at 127.0.0.1:18508 [Leader] entering Leader state
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:16.189927 [INFO] consul: cluster leadership acquired
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:16.190340 [INFO] consul: New leader elected: Node f68adf87-9c70-265f-8e45-f632b04e86a9
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.352222 [INFO] agent: Synced node info
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.352385 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/30 18:56:16.427089 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/30 18:56:16.427238 [DEBUG] agent: Node info in sync
2019/12/30 18:56:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:16 [INFO]  raft: Node at 127.0.0.1:18520 [Leader] entering Leader state
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:16.430621 [INFO] consul: cluster leadership acquired
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:16.431057 [INFO] consul: New leader elected: Node 9fd7f14b-2206-4a08-931f-f3d220c7ee15
TestDNS_SOA_Settings - 2019/12/30 18:56:16.433644 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 482.346µs) from client 127.0.0.1:39700 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:16.434300 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/30 18:56:16.434432 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/30 18:56:16.434489 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/30 18:56:16.642876 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:16.644571 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/30 18:56:16.734551 [INFO] manager: shutting down
TestDNS_SOA_Settings - 2019/12/30 18:56:16.735189 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/30 18:56:16.735253 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/30 18:56:16.735325 [INFO] agent: Stopping DNS server 127.0.0.1:18509 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:16.735530 [INFO] agent: Stopping DNS server 127.0.0.1:18509 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:16.735751 [INFO] agent: Stopping HTTP server 127.0.0.1:18510 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:16.735857 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/30 18:56:16.735973 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/30 18:56:16.736047 [INFO] agent: Endpoints down
TestDNS_SOA_Settings - 2019/12/30 18:56:16.736066 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.740626 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (756.354µs) from client 127.0.0.1:53360 (udp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.741206 [INFO] agent: Requesting shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.741290 [INFO] consul: shutting down server
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.741337 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.809350 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/30 18:56:16.878126 [WARN] agent: Node name "Node 7e9cf7b1-1e4e-bb79-7990-f234175a6471" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/30 18:56:16.880625 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/30 18:56:16.884885 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:16.942841 [INFO] manager: shutting down
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:16.943752 [INFO] agent: Synced node info
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:16.943858 [DEBUG] agent: Node info in sync
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.118161 [INFO] agent: consul server down
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.118260 [INFO] agent: shutdown complete
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.118351 [INFO] agent: Stopping DNS server 127.0.0.1:18497 (tcp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.118558 [INFO] agent: Stopping DNS server 127.0.0.1:18497 (udp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.118753 [INFO] agent: Stopping HTTP server 127.0.0.1:18498 (tcp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.119009 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.119100 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceReverseLookup_CustomDomain (3.74s)
=== CONT  TestDNS_ReverseLookup
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.119986 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.120211 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.120280 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.120333 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/30 18:56:17.120384 [ERR] consul: failed to transfer leadership in 3 attempts
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReverseLookup - 2019/12/30 18:56:17.282334 [WARN] agent: Node name "Node 76c72ad0-6aff-966e-e470-211aa23eb111" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReverseLookup - 2019/12/30 18:56:17.283534 [DEBUG] tlsutil: Update with version 1
TestDNS_ReverseLookup - 2019/12/30 18:56:17.291366 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:17.504173 [DEBUG] agent: Node info in sync
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:17.765888 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (581.016µs) from client 127.0.0.1:49126 (udp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:17.768087 [INFO] agent: Requesting shutdown
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:17.768173 [INFO] consul: shutting down server
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:17.768223 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.005225 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.010788 [DEBUG] dns: request for {9.2.3.8.2.4.0.0.0.0.f.f.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.ip6.arpa. 255 1} (888.691µs) from client 127.0.0.1:58712 (udp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.010890 [INFO] agent: Requesting shutdown
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.010963 [INFO] consul: shutting down server
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.011009 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.111261 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.112014 [INFO] manager: shutting down
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.192846 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.192896 [INFO] manager: shutting down
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193117 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193183 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.193196 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193135 [INFO] agent: consul server down
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193270 [INFO] agent: shutdown complete
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193325 [INFO] agent: Stopping DNS server 127.0.0.1:18515 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193474 [INFO] agent: Stopping DNS server 127.0.0.1:18515 (udp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.193499 [INFO] agent: consul server down
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.193605 [INFO] agent: shutdown complete
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193643 [INFO] agent: Stopping HTTP server 127.0.0.1:18516 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.193763 [INFO] agent: Stopping DNS server 127.0.0.1:18503 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193840 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.193901 [INFO] agent: Stopping DNS server 127.0.0.1:18503 (udp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/30 18:56:18.193905 [INFO] agent: Endpoints down
--- PASS: TestDNS_ReverseLookup_CustomDomain (4.11s)
=== CONT  TestDNS_EDNS0_ECS
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.194049 [INFO] agent: Stopping HTTP server 127.0.0.1:18504 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.194694 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/30 18:56:18.194774 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceReverseLookup_IPV6 (4.59s)
=== CONT  TestDNS_EDNS0
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_EDNS0 - 2019/12/30 18:56:18.269737 [WARN] agent: Node name "Node 83a04d9a-8fca-3ee3-2625-983de54bbef7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_EDNS0 - 2019/12/30 18:56:18.270194 [DEBUG] tlsutil: Update with version 1
TestDNS_EDNS0 - 2019/12/30 18:56:18.273150 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_EDNS0_ECS - 2019/12/30 18:56:18.292234 [WARN] agent: Node name "Node 4c60e313-85bd-0353-1376-f5794edb2b2d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_EDNS0_ECS - 2019/12/30 18:56:18.298551 [DEBUG] tlsutil: Update with version 1
TestDNS_EDNS0_ECS - 2019/12/30 18:56:18.301045 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7e9cf7b1-1e4e-bb79-7990-f234175a6471 Address:127.0.0.1:18526}]
2019/12/30 18:56:18 [INFO]  raft: Node at 127.0.0.1:18526 [Follower] entering Follower state (Leader: "")
TestDNS_SOA_Settings - 2019/12/30 18:56:18.564616 [INFO] serf: EventMemberJoin: Node 7e9cf7b1-1e4e-bb79-7990-f234175a6471.dc1 127.0.0.1
TestDNS_SOA_Settings - 2019/12/30 18:56:18.568112 [INFO] serf: EventMemberJoin: Node 7e9cf7b1-1e4e-bb79-7990-f234175a6471 127.0.0.1
TestDNS_SOA_Settings - 2019/12/30 18:56:18.569043 [INFO] consul: Adding LAN server Node 7e9cf7b1-1e4e-bb79-7990-f234175a6471 (Addr: tcp/127.0.0.1:18526) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/30 18:56:18.569370 [INFO] consul: Handled member-join event for server "Node 7e9cf7b1-1e4e-bb79-7990-f234175a6471.dc1" in area "wan"
TestDNS_SOA_Settings - 2019/12/30 18:56:18.569558 [INFO] agent: Started DNS server 127.0.0.1:18521 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:18.569944 [INFO] agent: Started DNS server 127.0.0.1:18521 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:18.572329 [INFO] agent: Started HTTP server on 127.0.0.1:18522 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:18.572661 [INFO] agent: started state syncer
2019/12/30 18:56:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:18 [INFO]  raft: Node at 127.0.0.1:18526 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:76c72ad0-6aff-966e-e470-211aa23eb111 Address:127.0.0.1:18532}]
TestDNS_ReverseLookup - 2019/12/30 18:56:18.674108 [INFO] serf: EventMemberJoin: Node 76c72ad0-6aff-966e-e470-211aa23eb111.dc1 127.0.0.1
TestDNS_ReverseLookup - 2019/12/30 18:56:18.678085 [INFO] serf: EventMemberJoin: Node 76c72ad0-6aff-966e-e470-211aa23eb111 127.0.0.1
2019/12/30 18:56:18 [INFO]  raft: Node at 127.0.0.1:18532 [Follower] entering Follower state (Leader: "")
TestDNS_ReverseLookup - 2019/12/30 18:56:18.680191 [INFO] consul: Handled member-join event for server "Node 76c72ad0-6aff-966e-e470-211aa23eb111.dc1" in area "wan"
TestDNS_ReverseLookup - 2019/12/30 18:56:18.680315 [INFO] consul: Adding LAN server Node 76c72ad0-6aff-966e-e470-211aa23eb111 (Addr: tcp/127.0.0.1:18532) (DC: dc1)
TestDNS_ReverseLookup - 2019/12/30 18:56:18.680764 [INFO] agent: Started DNS server 127.0.0.1:18527 (udp)
TestDNS_ReverseLookup - 2019/12/30 18:56:18.680848 [INFO] agent: Started DNS server 127.0.0.1:18527 (tcp)
TestDNS_ReverseLookup - 2019/12/30 18:56:18.683226 [INFO] agent: Started HTTP server on 127.0.0.1:18528 (tcp)
TestDNS_ReverseLookup - 2019/12/30 18:56:18.683337 [INFO] agent: started state syncer
2019/12/30 18:56:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:18 [INFO]  raft: Node at 127.0.0.1:18532 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:19 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:19 [INFO]  raft: Node at 127.0.0.1:18526 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/30 18:56:19.219026 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/30 18:56:19.219607 [INFO] consul: New leader elected: Node 7e9cf7b1-1e4e-bb79-7990-f234175a6471
2019/12/30 18:56:19 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:19 [INFO]  raft: Node at 127.0.0.1:18532 [Leader] entering Leader state
TestDNS_ReverseLookup - 2019/12/30 18:56:19.320547 [INFO] consul: cluster leadership acquired
TestDNS_ReverseLookup - 2019/12/30 18:56:19.320984 [INFO] consul: New leader elected: Node 76c72ad0-6aff-966e-e470-211aa23eb111
2019/12/30 18:56:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4c60e313-85bd-0353-1376-f5794edb2b2d Address:127.0.0.1:18538}]
2019/12/30 18:56:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:83a04d9a-8fca-3ee3-2625-983de54bbef7 Address:127.0.0.1:18544}]
2019/12/30 18:56:19 [INFO]  raft: Node at 127.0.0.1:18538 [Follower] entering Follower state (Leader: "")
2019/12/30 18:56:19 [INFO]  raft: Node at 127.0.0.1:18544 [Follower] entering Follower state (Leader: "")
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.407108 [INFO] serf: EventMemberJoin: Node 4c60e313-85bd-0353-1376-f5794edb2b2d.dc1 127.0.0.1
TestDNS_EDNS0 - 2019/12/30 18:56:19.407122 [INFO] serf: EventMemberJoin: Node 83a04d9a-8fca-3ee3-2625-983de54bbef7.dc1 127.0.0.1
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.411992 [INFO] serf: EventMemberJoin: Node 4c60e313-85bd-0353-1376-f5794edb2b2d 127.0.0.1
TestDNS_EDNS0 - 2019/12/30 18:56:19.412382 [INFO] serf: EventMemberJoin: Node 83a04d9a-8fca-3ee3-2625-983de54bbef7 127.0.0.1
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.413331 [INFO] agent: Started DNS server 127.0.0.1:18533 (udp)
TestDNS_EDNS0 - 2019/12/30 18:56:19.413441 [INFO] agent: Started DNS server 127.0.0.1:18539 (udp)
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.413960 [INFO] consul: Adding LAN server Node 4c60e313-85bd-0353-1376-f5794edb2b2d (Addr: tcp/127.0.0.1:18538) (DC: dc1)
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.414219 [INFO] consul: Handled member-join event for server "Node 4c60e313-85bd-0353-1376-f5794edb2b2d.dc1" in area "wan"
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.414856 [INFO] agent: Started DNS server 127.0.0.1:18533 (tcp)
TestDNS_EDNS0 - 2019/12/30 18:56:19.415086 [INFO] consul: Handled member-join event for server "Node 83a04d9a-8fca-3ee3-2625-983de54bbef7.dc1" in area "wan"
TestDNS_EDNS0 - 2019/12/30 18:56:19.415394 [INFO] agent: Started DNS server 127.0.0.1:18539 (tcp)
TestDNS_EDNS0 - 2019/12/30 18:56:19.416134 [INFO] consul: Adding LAN server Node 83a04d9a-8fca-3ee3-2625-983de54bbef7 (Addr: tcp/127.0.0.1:18544) (DC: dc1)
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.417146 [INFO] agent: Started HTTP server on 127.0.0.1:18534 (tcp)
TestDNS_EDNS0_ECS - 2019/12/30 18:56:19.417240 [INFO] agent: started state syncer
TestDNS_EDNS0 - 2019/12/30 18:56:19.417490 [INFO] agent: Started HTTP server on 127.0.0.1:18540 (tcp)
TestDNS_EDNS0 - 2019/12/30 18:56:19.417575 [INFO] agent: started state syncer
2019/12/30 18:56:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:19 [INFO]  raft: Node at 127.0.0.1:18538 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:19 [INFO]  raft: Node at 127.0.0.1:18544 [Candidate] entering Candidate state in term 2
TestDNS_ReverseLookup - 2019/12/30 18:56:19.785178 [INFO] agent: Synced node info
TestDNS_ReverseLookup - 2019/12/30 18:56:19.785307 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/30 18:56:20.727508 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/30 18:56:20.729533 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/30 18:56:20.736411 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 537.681µs) from client 127.0.0.1:43102 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:20.737232 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/30 18:56:20.737471 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/30 18:56:20.737688 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:20 [INFO]  raft: Node at 127.0.0.1:18538 [Leader] entering Leader state
2019/12/30 18:56:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:20 [INFO]  raft: Node at 127.0.0.1:18544 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/30 18:56:20.856653 [WARN] serf: Shutdown without a Leave
TestDNS_EDNS0_ECS - 2019/12/30 18:56:20.859769 [INFO] consul: cluster leadership acquired
TestDNS_EDNS0_ECS - 2019/12/30 18:56:20.860236 [INFO] consul: New leader elected: Node 4c60e313-85bd-0353-1376-f5794edb2b2d
TestDNS_EDNS0 - 2019/12/30 18:56:20.861044 [INFO] consul: cluster leadership acquired
TestDNS_EDNS0 - 2019/12/30 18:56:20.861657 [INFO] consul: New leader elected: Node 83a04d9a-8fca-3ee3-2625-983de54bbef7
TestDNS_ReverseLookup - 2019/12/30 18:56:20.864201 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (601.683µs) from client 127.0.0.1:57134 (udp)
TestDNS_ReverseLookup - 2019/12/30 18:56:20.864508 [INFO] agent: Requesting shutdown
TestDNS_ReverseLookup - 2019/12/30 18:56:20.865714 [INFO] consul: shutting down server
TestDNS_ReverseLookup - 2019/12/30 18:56:20.865762 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup - 2019/12/30 18:56:20.943577 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/30 18:56:20.952099 [INFO] manager: shutting down
TestDNS_ReverseLookup - 2019/12/30 18:56:21.054509 [INFO] manager: shutting down
TestDNS_ReverseLookup - 2019/12/30 18:56:21.136823 [ERR] agent: failed to sync remote state: No cluster leader
TestDNS_ReverseLookup - 2019/12/30 18:56:21.245825 [INFO] agent: consul server down
TestDNS_ReverseLookup - 2019/12/30 18:56:21.245912 [INFO] agent: shutdown complete
TestDNS_ReverseLookup - 2019/12/30 18:56:21.245968 [INFO] agent: Stopping DNS server 127.0.0.1:18527 (tcp)
TestDNS_ReverseLookup - 2019/12/30 18:56:21.246125 [INFO] agent: Stopping DNS server 127.0.0.1:18527 (udp)
TestDNS_ReverseLookup - 2019/12/30 18:56:21.246306 [INFO] agent: Stopping HTTP server 127.0.0.1:18528 (tcp)
TestDNS_ReverseLookup - 2019/12/30 18:56:21.246522 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ReverseLookup - 2019/12/30 18:56:21.246589 [INFO] agent: Endpoints down
--- PASS: TestDNS_ReverseLookup (4.13s)
=== CONT  TestCatalogNodeServices_Filter
TestDNS_ReverseLookup - 2019/12/30 18:56:21.257706 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ReverseLookup - 2019/12/30 18:56:21.257961 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ReverseLookup - 2019/12/30 18:56:21.258023 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ReverseLookup - 2019/12/30 18:56:21.258070 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ReverseLookup - 2019/12/30 18:56:21.258117 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_SOA_Settings - 2019/12/30 18:56:21.334816 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/30 18:56:21.334894 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/30 18:56:21.334945 [INFO] agent: Stopping DNS server 127.0.0.1:18521 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:21.335075 [INFO] agent: Stopping DNS server 127.0.0.1:18521 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:21.335211 [INFO] agent: Stopping HTTP server 127.0.0.1:18522 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:21.335403 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/30 18:56:21.335466 [INFO] agent: Endpoints down
TestDNS_SOA_Settings - 2019/12/30 18:56:21.354764 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodeServices_Filter - 2019/12/30 18:56:21.438149 [WARN] agent: Node name "Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodeServices_Filter - 2019/12/30 18:56:21.438915 [DEBUG] tlsutil: Update with version 1
TestCatalogNodeServices_Filter - 2019/12/30 18:56:21.444493 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/30 18:56:21.472586 [WARN] agent: Node name "Node 26f9ac81-506f-1ebb-41ac-89b07e531fb8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/30 18:56:21.472948 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/30 18:56:21.477084 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_EDNS0_ECS - 2019/12/30 18:56:21.527250 [INFO] agent: Synced node info
TestDNS_EDNS0 - 2019/12/30 18:56:21.528483 [INFO] agent: Synced node info
TestDNS_EDNS0 - 2019/12/30 18:56:22.207692 [DEBUG] dns: request for name foo.node.dc1.consul. type ANY class IN (took 521.681µs) from client 127.0.0.1:43897 (udp)
TestDNS_EDNS0 - 2019/12/30 18:56:22.207851 [INFO] agent: Requesting shutdown
TestDNS_EDNS0 - 2019/12/30 18:56:22.207914 [INFO] consul: shutting down server
TestDNS_EDNS0 - 2019/12/30 18:56:22.207978 [WARN] serf: Shutdown without a Leave
TestDNS_EDNS0 - 2019/12/30 18:56:22.294103 [WARN] serf: Shutdown without a Leave
TestDNS_EDNS0 - 2019/12/30 18:56:22.376350 [INFO] manager: shutting down
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.378175 [INFO] agent: Requesting shutdown
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.378261 [INFO] consul: shutting down server
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.378313 [WARN] serf: Shutdown without a Leave
TestDNS_EDNS0 - 2019/12/30 18:56:22.379317 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_EDNS0 - 2019/12/30 18:56:22.379575 [INFO] agent: consul server down
TestDNS_EDNS0 - 2019/12/30 18:56:22.379624 [INFO] agent: shutdown complete
TestDNS_EDNS0 - 2019/12/30 18:56:22.379681 [INFO] agent: Stopping DNS server 127.0.0.1:18539 (tcp)
TestDNS_EDNS0 - 2019/12/30 18:56:22.379815 [INFO] agent: Stopping DNS server 127.0.0.1:18539 (udp)
TestDNS_EDNS0 - 2019/12/30 18:56:22.379965 [INFO] agent: Stopping HTTP server 127.0.0.1:18540 (tcp)
TestDNS_EDNS0 - 2019/12/30 18:56:22.380165 [INFO] agent: Waiting for endpoints to shut down
TestDNS_EDNS0 - 2019/12/30 18:56:22.380235 [INFO] agent: Endpoints down
--- PASS: TestDNS_EDNS0 (4.19s)
=== CONT  TestDNS_NodeLookup_CNAME
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:22.466889 [WARN] agent: Node name "Node ae701a8d-fd93-90ac-b26f-84c0f6bde0b1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:22.467294 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:22.469534 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.476392 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:26f9ac81-506f-1ebb-41ac-89b07e531fb8 Address:127.0.0.1:18556}]
2019/12/30 18:56:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4a72a50d-69fd-d66c-8790-d6a1d1d9e88f Address:127.0.0.1:18550}]
TestDNS_SOA_Settings - 2019/12/30 18:56:22.481416 [INFO] serf: EventMemberJoin: Node 26f9ac81-506f-1ebb-41ac-89b07e531fb8.dc1 127.0.0.1
2019/12/30 18:56:22 [INFO]  raft: Node at 127.0.0.1:18550 [Follower] entering Follower state (Leader: "")
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.481475 [INFO] serf: EventMemberJoin: Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f.dc1 127.0.0.1
TestDNS_SOA_Settings - 2019/12/30 18:56:22.485207 [INFO] serf: EventMemberJoin: Node 26f9ac81-506f-1ebb-41ac-89b07e531fb8 127.0.0.1
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.486118 [INFO] serf: EventMemberJoin: Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f 127.0.0.1
TestDNS_SOA_Settings - 2019/12/30 18:56:22.486683 [INFO] agent: Started DNS server 127.0.0.1:18551 (udp)
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.487242 [INFO] agent: Started DNS server 127.0.0.1:18545 (udp)
2019/12/30 18:56:22 [INFO]  raft: Node at 127.0.0.1:18556 [Follower] entering Follower state (Leader: "")
TestDNS_SOA_Settings - 2019/12/30 18:56:22.488261 [INFO] consul: Adding LAN server Node 26f9ac81-506f-1ebb-41ac-89b07e531fb8 (Addr: tcp/127.0.0.1:18556) (DC: dc1)
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.488799 [INFO] consul: Adding LAN server Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f (Addr: tcp/127.0.0.1:18550) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/30 18:56:22.490665 [INFO] agent: Started DNS server 127.0.0.1:18551 (tcp)
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.488946 [INFO] consul: Handled member-join event for server "Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f.dc1" in area "wan"
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.489611 [INFO] agent: Started DNS server 127.0.0.1:18545 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:22.490109 [INFO] consul: Handled member-join event for server "Node 26f9ac81-506f-1ebb-41ac-89b07e531fb8.dc1" in area "wan"
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.493366 [INFO] agent: Started HTTP server on 127.0.0.1:18546 (tcp)
TestCatalogNodeServices_Filter - 2019/12/30 18:56:22.493457 [INFO] agent: started state syncer
TestDNS_SOA_Settings - 2019/12/30 18:56:22.495110 [INFO] agent: Started HTTP server on 127.0.0.1:18552 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:22.495222 [INFO] agent: started state syncer
2019/12/30 18:56:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:22 [INFO]  raft: Node at 127.0.0.1:18550 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:22 [INFO]  raft: Node at 127.0.0.1:18556 [Candidate] entering Candidate state in term 2
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.589628 [INFO] manager: shutting down
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.590087 [INFO] agent: consul server down
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.590143 [INFO] agent: shutdown complete
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.590196 [INFO] agent: Stopping DNS server 127.0.0.1:18533 (tcp)
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.590319 [INFO] agent: Stopping DNS server 127.0.0.1:18533 (udp)
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.590464 [INFO] agent: Stopping HTTP server 127.0.0.1:18534 (tcp)
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.590664 [INFO] agent: Waiting for endpoints to shut down
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.590747 [INFO] agent: Endpoints down
--- PASS: TestDNS_EDNS0_ECS (4.40s)
=== CONT  TestDNSCycleRecursorCheck
TestDNS_EDNS0_ECS - 2019/12/30 18:56:22.621778 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNSCycleRecursorCheck - 2019/12/30 18:56:22.688082 [WARN] agent: Node name "Node c5b6893c-0dab-182e-7320-bc5373f388d8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNSCycleRecursorCheck - 2019/12/30 18:56:22.688672 [DEBUG] tlsutil: Update with version 1
TestDNSCycleRecursorCheck - 2019/12/30 18:56:22.691275 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:23 [INFO]  raft: Node at 127.0.0.1:18550 [Leader] entering Leader state
TestCatalogNodeServices_Filter - 2019/12/30 18:56:23.253246 [INFO] consul: cluster leadership acquired
TestCatalogNodeServices_Filter - 2019/12/30 18:56:23.253771 [INFO] consul: New leader elected: Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f
2019/12/30 18:56:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:23 [INFO]  raft: Node at 127.0.0.1:18556 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/30 18:56:23.343387 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/30 18:56:23.343863 [INFO] consul: New leader elected: Node 26f9ac81-506f-1ebb-41ac-89b07e531fb8
2019/12/30 18:56:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ae701a8d-fd93-90ac-b26f-84c0f6bde0b1 Address:127.0.0.1:18562}]
TestCatalogNodeServices_Filter - 2019/12/30 18:56:23.660750 [INFO] agent: Synced node info
2019/12/30 18:56:23 [INFO]  raft: Node at 127.0.0.1:18562 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.665618 [INFO] serf: EventMemberJoin: Node ae701a8d-fd93-90ac-b26f-84c0f6bde0b1.dc1 127.0.0.1
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.675624 [INFO] serf: EventMemberJoin: Node ae701a8d-fd93-90ac-b26f-84c0f6bde0b1 127.0.0.1
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.678611 [INFO] consul: Adding LAN server Node ae701a8d-fd93-90ac-b26f-84c0f6bde0b1 (Addr: tcp/127.0.0.1:18562) (DC: dc1)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.679956 [INFO] consul: Handled member-join event for server "Node ae701a8d-fd93-90ac-b26f-84c0f6bde0b1.dc1" in area "wan"
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.682862 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.683327 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.683922 [INFO] agent: Started DNS server 127.0.0.1:18557 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.684004 [INFO] agent: Started DNS server 127.0.0.1:18557 (udp)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.693084 [INFO] agent: Started HTTP server on 127.0.0.1:18558 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:23.693191 [INFO] agent: started state syncer
2019/12/30 18:56:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:23 [INFO]  raft: Node at 127.0.0.1:18562 [Candidate] entering Candidate state in term 2
TestDNS_SOA_Settings - 2019/12/30 18:56:23.827128 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/30 18:56:23.852767 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 513.68µs) from client 127.0.0.1:56036 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:23.853145 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/30 18:56:23.853231 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/30 18:56:23.853281 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/30 18:56:24.001215 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c5b6893c-0dab-182e-7320-bc5373f388d8 Address:127.0.0.1:18568}]
2019/12/30 18:56:24 [INFO]  raft: Node at 127.0.0.1:18568 [Follower] entering Follower state (Leader: "")
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.007211 [INFO] serf: EventMemberJoin: Node c5b6893c-0dab-182e-7320-bc5373f388d8.dc1 127.0.0.1
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.011843 [INFO] serf: EventMemberJoin: Node c5b6893c-0dab-182e-7320-bc5373f388d8 127.0.0.1
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.013016 [INFO] consul: Adding LAN server Node c5b6893c-0dab-182e-7320-bc5373f388d8 (Addr: tcp/127.0.0.1:18568) (DC: dc1)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.013615 [INFO] consul: Handled member-join event for server "Node c5b6893c-0dab-182e-7320-bc5373f388d8.dc1" in area "wan"
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.014717 [DEBUG] dns: recursor enabled
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.015200 [INFO] agent: Started DNS server 127.0.0.1:18563 (tcp)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.015274 [DEBUG] dns: recursor enabled
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.015820 [INFO] agent: Started DNS server 127.0.0.1:18563 (udp)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.018116 [INFO] agent: Started HTTP server on 127.0.0.1:18564 (tcp)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.018365 [INFO] agent: started state syncer
2019/12/30 18:56:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:24 [INFO]  raft: Node at 127.0.0.1:18568 [Candidate] entering Candidate state in term 2
TestCatalogNodeServices_Filter - 2019/12/30 18:56:24.077093 [DEBUG] agent: Node info in sync
TestCatalogNodeServices_Filter - 2019/12/30 18:56:24.077227 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/30 18:56:24.120245 [INFO] manager: shutting down
TestDNS_SOA_Settings - 2019/12/30 18:56:24.392940 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_SOA_Settings - 2019/12/30 18:56:24.393278 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/30 18:56:24.393343 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/30 18:56:24.393420 [INFO] agent: Stopping DNS server 127.0.0.1:18551 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:24.393679 [INFO] agent: Stopping DNS server 127.0.0.1:18551 (udp)
TestDNS_SOA_Settings - 2019/12/30 18:56:24.393898 [INFO] agent: Stopping HTTP server 127.0.0.1:18552 (tcp)
TestDNS_SOA_Settings - 2019/12/30 18:56:24.394166 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/30 18:56:24.394247 [INFO] agent: Endpoints down
--- PASS: TestDNS_SOA_Settings (14.26s)
=== CONT  TestDNS_NodeLookup_AAAA
2019/12/30 18:56:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:24 [INFO]  raft: Node at 127.0.0.1:18562 [Leader] entering Leader state
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:24.497137 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:24.497596 [INFO] consul: New leader elected: Node ae701a8d-fd93-90ac-b26f-84c0f6bde0b1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:24.520357 [WARN] agent: Node name "Node 3c27dd2a-47b1-baa7-5e2b-bb31ecb42276" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:24.520759 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:24.522831 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:24 [INFO]  raft: Node at 127.0.0.1:18568 [Leader] entering Leader state
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.786918 [INFO] consul: cluster leadership acquired
TestDNSCycleRecursorCheck - 2019/12/30 18:56:24.787329 [INFO] consul: New leader elected: Node c5b6893c-0dab-182e-7320-bc5373f388d8
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:24.996358 [INFO] agent: Synced node info
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.000121 [DEBUG] agent: Node info in sync
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.002361 [DEBUG] dns: recurse RTT for {google.com. 1 1} (455.679µs) Recursor queried: 127.0.0.1:44926 Status returned: SERVFAIL
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.003276 [DEBUG] dns: recurse RTT for {google.com. 1 1} (464.012µs) Recursor queried: 127.0.0.1:51071
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.003521 [DEBUG] dns: request for {google.com. 1 1} (udp) (2.105056ms) from client 127.0.0.1:43603 (udp)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.003689 [INFO] agent: Requesting shutdown
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.003769 [INFO] consul: shutting down server
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.003813 [WARN] serf: Shutdown without a Leave
TestCatalogNodeServices_Filter - 2019/12/30 18:56:25.093834 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodeServices_Filter - 2019/12/30 18:56:25.096129 [DEBUG] consul: Skipping self join check for "Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f" since the cluster is too small
TestCatalogNodeServices_Filter - 2019/12/30 18:56:25.096329 [INFO] consul: member 'Node 4a72a50d-69fd-d66c-8790-d6a1d1d9e88f' joined, marking health alive
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.098821 [WARN] serf: Shutdown without a Leave
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.177227 [INFO] agent: Synced node info
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.177410 [DEBUG] agent: Node info in sync
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.178565 [INFO] manager: shutting down
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.351652 [INFO] agent: consul server down
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.351747 [INFO] agent: shutdown complete
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.351824 [INFO] agent: Stopping DNS server 127.0.0.1:18563 (tcp)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.352001 [INFO] agent: Stopping DNS server 127.0.0.1:18563 (udp)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.352207 [INFO] agent: Stopping HTTP server 127.0.0.1:18564 (tcp)
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.352508 [INFO] agent: Waiting for endpoints to shut down
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.352596 [INFO] agent: Endpoints down
--- PASS: TestDNSCycleRecursorCheck (2.76s)
=== CONT  TestDNS_NodeLookup_PeriodName
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.358113 [DEBUG] dns: cname recurse RTT for www.google.com. (935.358µs)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.358536 [DEBUG] dns: request for name google.node.consul. type ANY class IN (took 2.03572ms) from client 127.0.0.1:54992 (udp)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.358946 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.359102 [INFO] consul: shutting down server
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.359232 [WARN] serf: Shutdown without a Leave
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.368305 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNSCycleRecursorCheck - 2019/12/30 18:56:25.368604 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:25.408906 [WARN] agent: Node name "Node 7f055704-279b-d03f-7b2f-3f1900a144e6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:25.409345 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:25.411575 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.471024 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.627386 [INFO] manager: shutting down
2019/12/30 18:56:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3c27dd2a-47b1-baa7-5e2b-bb31ecb42276 Address:127.0.0.1:18574}]
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.633509 [INFO] serf: EventMemberJoin: Node 3c27dd2a-47b1-baa7-5e2b-bb31ecb42276.dc1 127.0.0.1
2019/12/30 18:56:25 [INFO]  raft: Node at 127.0.0.1:18574 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.643838 [INFO] serf: EventMemberJoin: Node 3c27dd2a-47b1-baa7-5e2b-bb31ecb42276 127.0.0.1
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.645490 [INFO] agent: Started DNS server 127.0.0.1:18569 (udp)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.646280 [INFO] consul: Adding LAN server Node 3c27dd2a-47b1-baa7-5e2b-bb31ecb42276 (Addr: tcp/127.0.0.1:18574) (DC: dc1)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.646755 [INFO] consul: Handled member-join event for server "Node 3c27dd2a-47b1-baa7-5e2b-bb31ecb42276.dc1" in area "wan"
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.646897 [INFO] agent: Started DNS server 127.0.0.1:18569 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.650273 [INFO] agent: Started HTTP server on 127.0.0.1:18570 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:25.650361 [INFO] agent: started state syncer
2019/12/30 18:56:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:25 [INFO]  raft: Node at 127.0.0.1:18574 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.745498 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.745752 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.745843 [INFO] agent: consul server down
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.745887 [INFO] agent: shutdown complete
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.745970 [INFO] agent: Stopping DNS server 127.0.0.1:18557 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.746126 [INFO] agent: Stopping DNS server 127.0.0.1:18557 (udp)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.746307 [INFO] agent: Stopping HTTP server 127.0.0.1:18558 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.746508 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_CNAME - 2019/12/30 18:56:25.746579 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_CNAME (3.37s)
=== CONT  TestDNS_CaseInsensitiveNodeLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:25.823259 [WARN] agent: Node name "Node 014e75ac-f3d7-6267-e2f9-0878c1da10f0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:25.824151 [DEBUG] tlsutil: Update with version 1
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:25.827019 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodeServices_Filter - 2019/12/30 18:56:25.947427 [INFO] agent: Requesting shutdown
TestCatalogNodeServices_Filter - 2019/12/30 18:56:25.947542 [INFO] consul: shutting down server
TestCatalogNodeServices_Filter - 2019/12/30 18:56:25.947599 [WARN] serf: Shutdown without a Leave
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.026378 [WARN] serf: Shutdown without a Leave
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.159981 [INFO] manager: shutting down
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.160608 [INFO] agent: consul server down
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.160686 [INFO] agent: shutdown complete
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.160780 [INFO] agent: Stopping DNS server 127.0.0.1:18545 (tcp)
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.160946 [INFO] agent: Stopping DNS server 127.0.0.1:18545 (udp)
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.161218 [INFO] agent: Stopping HTTP server 127.0.0.1:18546 (tcp)
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.161509 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodeServices_Filter - 2019/12/30 18:56:26.161618 [INFO] agent: Endpoints down
--- PASS: TestCatalogNodeServices_Filter (4.91s)
=== CONT  TestDNS_Over_TCP
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Over_TCP - 2019/12/30 18:56:26.223736 [WARN] agent: Node name "Node 3b1f8b48-6848-ac56-ce5c-f1104968489c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Over_TCP - 2019/12/30 18:56:26.224291 [DEBUG] tlsutil: Update with version 1
TestDNS_Over_TCP - 2019/12/30 18:56:26.227230 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:26 [INFO]  raft: Node at 127.0.0.1:18574 [Leader] entering Leader state
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:26.335578 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:26.336109 [INFO] consul: New leader elected: Node 3c27dd2a-47b1-baa7-5e2b-bb31ecb42276
2019/12/30 18:56:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7f055704-279b-d03f-7b2f-3f1900a144e6 Address:127.0.0.1:18580}]
2019/12/30 18:56:26 [INFO]  raft: Node at 127.0.0.1:18580 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.522286 [INFO] serf: EventMemberJoin: Node 7f055704-279b-d03f-7b2f-3f1900a144e6.dc1 127.0.0.1
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.526026 [INFO] serf: EventMemberJoin: Node 7f055704-279b-d03f-7b2f-3f1900a144e6 127.0.0.1
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.526812 [INFO] consul: Adding LAN server Node 7f055704-279b-d03f-7b2f-3f1900a144e6 (Addr: tcp/127.0.0.1:18580) (DC: dc1)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.527401 [INFO] consul: Handled member-join event for server "Node 7f055704-279b-d03f-7b2f-3f1900a144e6.dc1" in area "wan"
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.528610 [INFO] agent: Started DNS server 127.0.0.1:18575 (udp)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.528905 [INFO] agent: Started DNS server 127.0.0.1:18575 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.531253 [INFO] agent: Started HTTP server on 127.0.0.1:18576 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:26.531357 [INFO] agent: started state syncer
2019/12/30 18:56:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:26 [INFO]  raft: Node at 127.0.0.1:18580 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:26.802372 [INFO] agent: Synced node info
2019/12/30 18:56:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:014e75ac-f3d7-6267-e2f9-0878c1da10f0 Address:127.0.0.1:18586}]
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.908834 [INFO] serf: EventMemberJoin: Node 014e75ac-f3d7-6267-e2f9-0878c1da10f0.dc1 127.0.0.1
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.913153 [INFO] serf: EventMemberJoin: Node 014e75ac-f3d7-6267-e2f9-0878c1da10f0 127.0.0.1
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.914597 [INFO] agent: Started DNS server 127.0.0.1:18581 (udp)
2019/12/30 18:56:26 [INFO]  raft: Node at 127.0.0.1:18586 [Follower] entering Follower state (Leader: "")
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.916469 [INFO] consul: Adding LAN server Node 014e75ac-f3d7-6267-e2f9-0878c1da10f0 (Addr: tcp/127.0.0.1:18586) (DC: dc1)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.916734 [INFO] consul: Handled member-join event for server "Node 014e75ac-f3d7-6267-e2f9-0878c1da10f0.dc1" in area "wan"
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.917251 [INFO] agent: Started DNS server 127.0.0.1:18581 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.919610 [INFO] agent: Started HTTP server on 127.0.0.1:18582 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:26.919734 [INFO] agent: started state syncer
2019/12/30 18:56:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:26 [INFO]  raft: Node at 127.0.0.1:18586 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:27 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:27 [INFO]  raft: Node at 127.0.0.1:18580 [Leader] entering Leader state
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:27.339012 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:27.339538 [INFO] consul: New leader elected: Node 7f055704-279b-d03f-7b2f-3f1900a144e6
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.343702 [DEBUG] dns: request for name bar.node.consul. type AAAA class IN (took 522.014µs) from client 127.0.0.1:41992 (udp)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.344100 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.344167 [INFO] consul: shutting down server
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.344211 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3b1f8b48-6848-ac56-ce5c-f1104968489c Address:127.0.0.1:18592}]
2019/12/30 18:56:27 [INFO]  raft: Node at 127.0.0.1:18592 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.421426 [WARN] serf: Shutdown without a Leave
TestDNS_Over_TCP - 2019/12/30 18:56:27.427082 [INFO] serf: EventMemberJoin: Node 3b1f8b48-6848-ac56-ce5c-f1104968489c.dc1 127.0.0.1
TestDNS_Over_TCP - 2019/12/30 18:56:27.432847 [INFO] serf: EventMemberJoin: Node 3b1f8b48-6848-ac56-ce5c-f1104968489c 127.0.0.1
TestDNS_Over_TCP - 2019/12/30 18:56:27.433562 [INFO] consul: Adding LAN server Node 3b1f8b48-6848-ac56-ce5c-f1104968489c (Addr: tcp/127.0.0.1:18592) (DC: dc1)
TestDNS_Over_TCP - 2019/12/30 18:56:27.434142 [INFO] consul: Handled member-join event for server "Node 3b1f8b48-6848-ac56-ce5c-f1104968489c.dc1" in area "wan"
TestDNS_Over_TCP - 2019/12/30 18:56:27.436292 [INFO] agent: Started DNS server 127.0.0.1:18587 (tcp)
TestDNS_Over_TCP - 2019/12/30 18:56:27.436459 [INFO] agent: Started DNS server 127.0.0.1:18587 (udp)
TestDNS_Over_TCP - 2019/12/30 18:56:27.441511 [INFO] agent: Started HTTP server on 127.0.0.1:18588 (tcp)
TestDNS_Over_TCP - 2019/12/30 18:56:27.441740 [INFO] agent: started state syncer
2019/12/30 18:56:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:27 [INFO]  raft: Node at 127.0.0.1:18592 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.527330 [INFO] manager: shutting down
2019/12/30 18:56:27 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:27 [INFO]  raft: Node at 127.0.0.1:18586 [Leader] entering Leader state
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:27.596722 [INFO] consul: cluster leadership acquired
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:27.597240 [INFO] consul: New leader elected: Node 014e75ac-f3d7-6267-e2f9-0878c1da10f0
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.597924 [INFO] agent: consul server down
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.597993 [INFO] agent: shutdown complete
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.598056 [INFO] agent: Stopping DNS server 127.0.0.1:18569 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.598202 [INFO] agent: Stopping DNS server 127.0.0.1:18569 (udp)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.598369 [INFO] agent: Stopping HTTP server 127.0.0.1:18570 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.598577 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.598656 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_AAAA (3.20s)
=== CONT  TestRecursorAddr
--- PASS: TestRecursorAddr (0.00s)
=== CONT  TestCoordinate_Update_ACLDeny
TestDNS_NodeLookup_AAAA - 2019/12/30 18:56:27.604800 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:27.681714 [WARN] agent: Node name "Node a8380970-6449-748a-4040-638e855029d2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:27.682119 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:27.684317 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:27.952575 [INFO] agent: Synced node info
2019/12/30 18:56:28 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:28 [INFO]  raft: Node at 127.0.0.1:18592 [Leader] entering Leader state
TestDNS_Over_TCP - 2019/12/30 18:56:28.095776 [INFO] consul: cluster leadership acquired
TestDNS_Over_TCP - 2019/12/30 18:56:28.096326 [INFO] consul: New leader elected: Node 3b1f8b48-6848-ac56-ce5c-f1104968489c
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:28.356202 [INFO] agent: Synced node info
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.440235 [DEBUG] dns: request for name fOO.node.dc1.consul. type ANY class IN (took 468.679µs) from client 127.0.0.1:48515 (udp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.440522 [INFO] agent: Requesting shutdown
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.440590 [INFO] consul: shutting down server
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.440636 [WARN] serf: Shutdown without a Leave
TestDNS_Over_TCP - 2019/12/30 18:56:28.520521 [INFO] agent: Synced node info
TestDNS_Over_TCP - 2019/12/30 18:56:28.520648 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.524013 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.613208 [INFO] manager: shutting down
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.710151 [INFO] agent: consul server down
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.710251 [INFO] agent: shutdown complete
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.710331 [INFO] agent: Stopping DNS server 127.0.0.1:18581 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.710530 [INFO] agent: Stopping DNS server 127.0.0.1:18581 (udp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.710728 [INFO] agent: Stopping HTTP server 127.0.0.1:18582 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.710970 [INFO] agent: Waiting for endpoints to shut down
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.711059 [INFO] agent: Endpoints down
--- PASS: TestDNS_CaseInsensitiveNodeLookup (2.96s)
=== CONT  TestCoordinate_Update
2019/12/30 18:56:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a8380970-6449-748a-4040-638e855029d2 Address:127.0.0.1:18598}]
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.716621 [INFO] serf: EventMemberJoin: Node a8380970-6449-748a-4040-638e855029d2.dc1 127.0.0.1
2019/12/30 18:56:28 [INFO]  raft: Node at 127.0.0.1:18598 [Follower] entering Follower state (Leader: "")
TestDNS_CaseInsensitiveNodeLookup - 2019/12/30 18:56:28.720296 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.726839 [INFO] serf: EventMemberJoin: Node a8380970-6449-748a-4040-638e855029d2 127.0.0.1
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.728724 [INFO] consul: Adding LAN server Node a8380970-6449-748a-4040-638e855029d2 (Addr: tcp/127.0.0.1:18598) (DC: dc1)
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.729556 [INFO] consul: Handled member-join event for server "Node a8380970-6449-748a-4040-638e855029d2.dc1" in area "wan"
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.732551 [INFO] agent: Started DNS server 127.0.0.1:18593 (tcp)
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.732746 [INFO] agent: Started DNS server 127.0.0.1:18593 (udp)
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.735454 [INFO] agent: Started HTTP server on 127.0.0.1:18594 (tcp)
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:28.736045 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Update - 2019/12/30 18:56:28.775400 [WARN] agent: Node name "Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Update - 2019/12/30 18:56:28.775935 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Update - 2019/12/30 18:56:28.778131 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:28 [INFO]  raft: Node at 127.0.0.1:18598 [Candidate] entering Candidate state in term 2
TestDNS_Over_TCP - 2019/12/30 18:56:28.879877 [DEBUG] dns: request for name foo.node.dc1.consul. type ANY class IN (took 619.35µs) from client 127.0.0.1:49438 (tcp)
TestDNS_Over_TCP - 2019/12/30 18:56:28.880244 [INFO] agent: Requesting shutdown
TestDNS_Over_TCP - 2019/12/30 18:56:28.880322 [INFO] consul: shutting down server
TestDNS_Over_TCP - 2019/12/30 18:56:28.880373 [WARN] serf: Shutdown without a Leave
TestDNS_Over_TCP - 2019/12/30 18:56:28.978348 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:28.982206 [DEBUG] dns: request for name foo.bar.node.consul. type ANY class IN (took 746.353µs) from client 127.0.0.1:59292 (udp)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:28.982489 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:28.982563 [INFO] consul: shutting down server
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:28.982609 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.085893 [WARN] serf: Shutdown without a Leave
TestDNS_Over_TCP - 2019/12/30 18:56:29.086939 [INFO] manager: shutting down
TestDNS_Over_TCP - 2019/12/30 18:56:29.168504 [INFO] agent: consul server down
TestDNS_Over_TCP - 2019/12/30 18:56:29.168600 [INFO] agent: shutdown complete
TestDNS_Over_TCP - 2019/12/30 18:56:29.168678 [INFO] agent: Stopping DNS server 127.0.0.1:18587 (tcp)
TestDNS_Over_TCP - 2019/12/30 18:56:29.168854 [INFO] agent: Stopping DNS server 127.0.0.1:18587 (udp)
TestDNS_Over_TCP - 2019/12/30 18:56:29.169062 [INFO] agent: Stopping HTTP server 127.0.0.1:18588 (tcp)
TestDNS_Over_TCP - 2019/12/30 18:56:29.169361 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Over_TCP - 2019/12/30 18:56:29.169526 [INFO] agent: Endpoints down
--- PASS: TestDNS_Over_TCP (3.01s)
=== CONT  TestCoordinate_Node
TestDNS_Over_TCP - 2019/12/30 18:56:29.171312 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_Over_TCP - 2019/12/30 18:56:29.171677 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.174234 [INFO] manager: shutting down
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.185397 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.185485 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.186224 [INFO] agent: consul server down
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.186285 [INFO] agent: shutdown complete
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.186355 [INFO] agent: Stopping DNS server 127.0.0.1:18575 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.186505 [INFO] agent: Stopping DNS server 127.0.0.1:18575 (udp)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.186658 [INFO] agent: Stopping HTTP server 127.0.0.1:18576 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.186847 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_PeriodName - 2019/12/30 18:56:29.186916 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_PeriodName (3.83s)
=== CONT  TestCoordinate_Nodes
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Node - 2019/12/30 18:56:29.232575 [WARN] agent: Node name "Node f311b591-2e28-4bba-c110-b32511cb2a61" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Node - 2019/12/30 18:56:29.232995 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Node - 2019/12/30 18:56:29.235341 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Nodes - 2019/12/30 18:56:29.276890 [WARN] agent: Node name "Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Nodes - 2019/12/30 18:56:29.277298 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Nodes - 2019/12/30 18:56:29.279522 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:29 [INFO]  raft: Node at 127.0.0.1:18598 [Leader] entering Leader state
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.427635 [INFO] consul: cluster leadership acquired
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.428074 [INFO] consul: New leader elected: Node a8380970-6449-748a-4040-638e855029d2
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.488458 [INFO] acl: initializing acls
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.690549 [ERR] agent: failed to sync remote state: ACL not found
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.836718 [INFO] consul: Created ACL 'global-management' policy
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.836829 [WARN] consul: Configuring a non-UUID master token is deprecated
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.840216 [INFO] acl: initializing acls
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:29.840362 [WARN] consul: Configuring a non-UUID master token is deprecated
2019/12/30 18:56:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:68ce883f-7d95-52c5-4852-4c37af5c6c3d Address:127.0.0.1:18604}]
TestCoordinate_Update - 2019/12/30 18:56:29.932865 [INFO] serf: EventMemberJoin: Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d.dc1 127.0.0.1
2019/12/30 18:56:29 [INFO]  raft: Node at 127.0.0.1:18604 [Follower] entering Follower state (Leader: "")
TestCoordinate_Update - 2019/12/30 18:56:29.936858 [INFO] serf: EventMemberJoin: Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d 127.0.0.1
TestCoordinate_Update - 2019/12/30 18:56:29.937664 [INFO] consul: Handled member-join event for server "Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d.dc1" in area "wan"
TestCoordinate_Update - 2019/12/30 18:56:29.937994 [INFO] consul: Adding LAN server Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d (Addr: tcp/127.0.0.1:18604) (DC: dc1)
TestCoordinate_Update - 2019/12/30 18:56:29.938211 [INFO] agent: Started DNS server 127.0.0.1:18599 (udp)
TestCoordinate_Update - 2019/12/30 18:56:29.938635 [INFO] agent: Started DNS server 127.0.0.1:18599 (tcp)
TestCoordinate_Update - 2019/12/30 18:56:29.940948 [INFO] agent: Started HTTP server on 127.0.0.1:18600 (tcp)
TestCoordinate_Update - 2019/12/30 18:56:29.941067 [INFO] agent: started state syncer
2019/12/30 18:56:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18604 [Candidate] entering Candidate state in term 2
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.021490 [INFO] consul: Bootstrapped ACL master token from configuration
2019/12/30 18:56:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb Address:127.0.0.1:18616}]
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18616 [Follower] entering Follower state (Leader: "")
TestCoordinate_Nodes - 2019/12/30 18:56:30.280651 [INFO] serf: EventMemberJoin: Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb.dc1 127.0.0.1
2019/12/30 18:56:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f311b591-2e28-4bba-c110-b32511cb2a61 Address:127.0.0.1:18610}]
TestCoordinate_Nodes - 2019/12/30 18:56:30.288268 [INFO] serf: EventMemberJoin: Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb 127.0.0.1
TestCoordinate_Node - 2019/12/30 18:56:30.292048 [INFO] serf: EventMemberJoin: Node f311b591-2e28-4bba-c110-b32511cb2a61.dc1 127.0.0.1
TestCoordinate_Nodes - 2019/12/30 18:56:30.292871 [INFO] consul: Adding LAN server Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb (Addr: tcp/127.0.0.1:18616) (DC: dc1)
TestCoordinate_Nodes - 2019/12/30 18:56:30.293574 [INFO] consul: Handled member-join event for server "Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb.dc1" in area "wan"
TestCoordinate_Nodes - 2019/12/30 18:56:30.295447 [INFO] agent: Started DNS server 127.0.0.1:18611 (udp)
TestCoordinate_Nodes - 2019/12/30 18:56:30.295533 [INFO] agent: Started DNS server 127.0.0.1:18611 (tcp)
TestCoordinate_Nodes - 2019/12/30 18:56:30.298031 [INFO] agent: Started HTTP server on 127.0.0.1:18612 (tcp)
TestCoordinate_Nodes - 2019/12/30 18:56:30.298132 [INFO] agent: started state syncer
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18610 [Follower] entering Follower state (Leader: "")
TestCoordinate_Node - 2019/12/30 18:56:30.308826 [INFO] serf: EventMemberJoin: Node f311b591-2e28-4bba-c110-b32511cb2a61 127.0.0.1
TestCoordinate_Node - 2019/12/30 18:56:30.310005 [INFO] consul: Adding LAN server Node f311b591-2e28-4bba-c110-b32511cb2a61 (Addr: tcp/127.0.0.1:18610) (DC: dc1)
TestCoordinate_Node - 2019/12/30 18:56:30.310401 [INFO] consul: Handled member-join event for server "Node f311b591-2e28-4bba-c110-b32511cb2a61.dc1" in area "wan"
TestCoordinate_Node - 2019/12/30 18:56:30.312514 [INFO] agent: Started DNS server 127.0.0.1:18605 (tcp)
TestCoordinate_Node - 2019/12/30 18:56:30.312933 [INFO] agent: Started DNS server 127.0.0.1:18605 (udp)
2019/12/30 18:56:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18616 [Candidate] entering Candidate state in term 2
TestCoordinate_Node - 2019/12/30 18:56:30.318158 [INFO] agent: Started HTTP server on 127.0.0.1:18606 (tcp)
TestCoordinate_Node - 2019/12/30 18:56:30.318265 [INFO] agent: started state syncer
2019/12/30 18:56:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18610 [Candidate] entering Candidate state in term 2
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.513433 [INFO] consul: Created ACL anonymous token from configuration
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.514806 [DEBUG] acl: transitioning out of legacy ACL mode
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.517334 [INFO] serf: EventMemberUpdate: Node a8380970-6449-748a-4040-638e855029d2
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.514692 [INFO] consul: Bootstrapped ACL master token from configuration
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.518437 [INFO] serf: EventMemberUpdate: Node a8380970-6449-748a-4040-638e855029d2.dc1
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.519665 [INFO] serf: EventMemberUpdate: Node a8380970-6449-748a-4040-638e855029d2
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:30.521351 [INFO] serf: EventMemberUpdate: Node a8380970-6449-748a-4040-638e855029d2.dc1
2019/12/30 18:56:30 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18604 [Leader] entering Leader state
TestCoordinate_Update - 2019/12/30 18:56:30.619589 [INFO] consul: cluster leadership acquired
TestCoordinate_Update - 2019/12/30 18:56:30.620051 [INFO] consul: New leader elected: Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d
2019/12/30 18:56:30 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18616 [Leader] entering Leader state
2019/12/30 18:56:30 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:30 [INFO]  raft: Node at 127.0.0.1:18610 [Leader] entering Leader state
TestCoordinate_Nodes - 2019/12/30 18:56:30.874197 [INFO] consul: cluster leadership acquired
TestCoordinate_Nodes - 2019/12/30 18:56:30.874747 [INFO] consul: New leader elected: Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb
TestCoordinate_Node - 2019/12/30 18:56:30.876301 [INFO] consul: cluster leadership acquired
TestCoordinate_Node - 2019/12/30 18:56:30.876660 [INFO] consul: New leader elected: Node f311b591-2e28-4bba-c110-b32511cb2a61
TestCoordinate_Update - 2019/12/30 18:56:30.952568 [INFO] agent: Synced node info
TestCoordinate_Update - 2019/12/30 18:56:30.952705 [DEBUG] agent: Node info in sync
TestCoordinate_Nodes - 2019/12/30 18:56:31.370331 [INFO] agent: Synced node info
TestCoordinate_Nodes - 2019/12/30 18:56:31.370474 [DEBUG] agent: Node info in sync
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.710468 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.710965 [DEBUG] consul: Skipping self join check for "Node a8380970-6449-748a-4040-638e855029d2" since the cluster is too small
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.711070 [INFO] consul: member 'Node a8380970-6449-748a-4040-638e855029d2' joined, marking health alive
TestCoordinate_Node - 2019/12/30 18:56:31.789862 [INFO] agent: Synced node info
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.874465 [DEBUG] consul: Skipping self join check for "Node a8380970-6449-748a-4040-638e855029d2" since the cluster is too small
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.874973 [DEBUG] consul: Skipping self join check for "Node a8380970-6449-748a-4040-638e855029d2" since the cluster is too small
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.890861 [DEBUG] consul: dropping node "Node a8380970-6449-748a-4040-638e855029d2" from result due to ACLs
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.891056 [INFO] agent: Requesting shutdown
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.891125 [INFO] consul: shutting down server
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:31.891173 [WARN] serf: Shutdown without a Leave
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.018029 [WARN] serf: Shutdown without a Leave
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.080549 [INFO] manager: shutting down
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.080989 [INFO] agent: consul server down
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.081047 [INFO] agent: shutdown complete
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.081155 [INFO] agent: Stopping DNS server 127.0.0.1:18593 (tcp)
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.081315 [INFO] agent: Stopping DNS server 127.0.0.1:18593 (udp)
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.081477 [INFO] agent: Stopping HTTP server 127.0.0.1:18594 (tcp)
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.081671 [INFO] agent: Waiting for endpoints to shut down
TestCoordinate_Update_ACLDeny - 2019/12/30 18:56:32.081733 [INFO] agent: Endpoints down
--- PASS: TestCoordinate_Update_ACLDeny (4.48s)
=== CONT  TestCoordinate_Disabled_Response
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Disabled_Response - 2019/12/30 18:56:32.181111 [WARN] agent: Node name "Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Disabled_Response - 2019/12/30 18:56:32.182122 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Disabled_Response - 2019/12/30 18:56:32.186708 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Update - 2019/12/30 18:56:32.285297 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Update - 2019/12/30 18:56:32.285763 [DEBUG] consul: Skipping self join check for "Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d" since the cluster is too small
TestCoordinate_Update - 2019/12/30 18:56:32.285908 [INFO] consul: member 'Node 68ce883f-7d95-52c5-4852-4c37af5c6c3d' joined, marking health alive
TestCoordinate_Node - 2019/12/30 18:56:32.313392 [DEBUG] agent: Node info in sync
TestCoordinate_Node - 2019/12/30 18:56:32.313515 [DEBUG] agent: Node info in sync
TestCoordinate_Update - 2019/12/30 18:56:32.655936 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestCoordinate_Update - 2019/12/30 18:56:32.656036 [DEBUG] agent: Node info in sync
TestCoordinate_Nodes - 2019/12/30 18:56:32.664141 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Nodes - 2019/12/30 18:56:32.664702 [DEBUG] consul: Skipping self join check for "Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb" since the cluster is too small
TestCoordinate_Nodes - 2019/12/30 18:56:32.664854 [INFO] consul: member 'Node a15ce3dc-4e8a-4be8-6dbf-f7dcc73bfdcb' joined, marking health alive
TestCoordinate_Update - 2019/12/30 18:56:32.965984 [INFO] agent: Requesting shutdown
TestCoordinate_Update - 2019/12/30 18:56:32.966080 [INFO] consul: shutting down server
TestCoordinate_Update - 2019/12/30 18:56:32.966127 [WARN] serf: Shutdown without a Leave
TestCoordinate_Node - 2019/12/30 18:56:33.018857 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Node - 2019/12/30 18:56:33.019295 [DEBUG] consul: Skipping self join check for "Node f311b591-2e28-4bba-c110-b32511cb2a61" since the cluster is too small
TestCoordinate_Node - 2019/12/30 18:56:33.019513 [INFO] consul: member 'Node f311b591-2e28-4bba-c110-b32511cb2a61' joined, marking health alive
TestCoordinate_Update - 2019/12/30 18:56:33.111878 [WARN] serf: Shutdown without a Leave
TestCoordinate_Update - 2019/12/30 18:56:33.195483 [INFO] manager: shutting down
TestCoordinate_Update - 2019/12/30 18:56:33.195920 [INFO] agent: consul server down
TestCoordinate_Update - 2019/12/30 18:56:33.195971 [INFO] agent: shutdown complete
TestCoordinate_Update - 2019/12/30 18:56:33.196019 [INFO] agent: Stopping DNS server 127.0.0.1:18599 (tcp)
TestCoordinate_Update - 2019/12/30 18:56:33.196145 [INFO] agent: Stopping DNS server 127.0.0.1:18599 (udp)
TestCoordinate_Update - 2019/12/30 18:56:33.196290 [INFO] agent: Stopping HTTP server 127.0.0.1:18600 (tcp)
TestCoordinate_Update - 2019/12/30 18:56:33.196475 [INFO] agent: Waiting for endpoints to shut down
TestCoordinate_Update - 2019/12/30 18:56:33.196537 [INFO] agent: Endpoints down
--- PASS: TestCoordinate_Update (4.49s)
=== CONT  TestConnectCAConfig
2019/12/30 18:56:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9d29210b-1f1c-0005-51d3-8f47dbbd2863 Address:127.0.0.1:18622}]
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.272080 [INFO] serf: EventMemberJoin: Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863.dc1 127.0.0.1
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.275938 [INFO] serf: EventMemberJoin: Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863 127.0.0.1
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.277353 [INFO] agent: Started DNS server 127.0.0.1:18617 (udp)
2019/12/30 18:56:33 [INFO]  raft: Node at 127.0.0.1:18622 [Follower] entering Follower state (Leader: "")
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.279613 [INFO] consul: Adding LAN server Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863 (Addr: tcp/127.0.0.1:18622) (DC: dc1)
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.280046 [INFO] consul: Handled member-join event for server "Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863.dc1" in area "wan"
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.280634 [INFO] agent: Started DNS server 127.0.0.1:18617 (tcp)
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.283812 [INFO] agent: Started HTTP server on 127.0.0.1:18618 (tcp)
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.283919 [INFO] agent: started state syncer
2019/12/30 18:56:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:33 [INFO]  raft: Node at 127.0.0.1:18622 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCAConfig - 2019/12/30 18:56:33.341020 [WARN] agent: Node name "Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCAConfig - 2019/12/30 18:56:33.341471 [DEBUG] tlsutil: Update with version 1
TestConnectCAConfig - 2019/12/30 18:56:33.352921 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Nodes - 2019/12/30 18:56:33.440657 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestCoordinate_Nodes - 2019/12/30 18:56:33.440750 [DEBUG] agent: Node info in sync
TestCoordinate_Nodes - 2019/12/30 18:56:33.644575 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Nodes - 2019/12/30 18:56:33.679054 [INFO] agent: Requesting shutdown
TestCoordinate_Nodes - 2019/12/30 18:56:33.679153 [INFO] consul: shutting down server
TestCoordinate_Nodes - 2019/12/30 18:56:33.679220 [WARN] serf: Shutdown without a Leave
TestCoordinate_Nodes - 2019/12/30 18:56:33.784793 [WARN] serf: Shutdown without a Leave
TestCoordinate_Nodes - 2019/12/30 18:56:33.859870 [INFO] manager: shutting down
TestCoordinate_Nodes - 2019/12/30 18:56:33.860711 [INFO] agent: consul server down
TestCoordinate_Nodes - 2019/12/30 18:56:33.860783 [INFO] agent: shutdown complete
TestCoordinate_Nodes - 2019/12/30 18:56:33.860863 [INFO] agent: Stopping DNS server 127.0.0.1:18611 (tcp)
TestCoordinate_Nodes - 2019/12/30 18:56:33.861048 [INFO] agent: Stopping DNS server 127.0.0.1:18611 (udp)
TestCoordinate_Nodes - 2019/12/30 18:56:33.861239 [INFO] agent: Stopping HTTP server 127.0.0.1:18612 (tcp)
TestCoordinate_Nodes - 2019/12/30 18:56:33.861460 [INFO] agent: Waiting for endpoints to shut down
TestCoordinate_Nodes - 2019/12/30 18:56:33.861535 [INFO] agent: Endpoints down
--- PASS: TestCoordinate_Nodes (4.67s)
=== CONT  TestConnectCARoots_list
2019/12/30 18:56:33 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:33 [INFO]  raft: Node at 127.0.0.1:18622 [Leader] entering Leader state
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.935981 [INFO] consul: cluster leadership acquired
TestCoordinate_Disabled_Response - 2019/12/30 18:56:33.936495 [INFO] consul: New leader elected: Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863
TestCoordinate_Node - 2019/12/30 18:56:33.949371 [INFO] agent: Requesting shutdown
TestCoordinate_Node - 2019/12/30 18:56:33.952272 [INFO] consul: shutting down server
TestCoordinate_Node - 2019/12/30 18:56:33.952317 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCARoots_list - 2019/12/30 18:56:33.978066 [WARN] agent: Node name "Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCARoots_list - 2019/12/30 18:56:33.978450 [DEBUG] tlsutil: Update with version 1
TestConnectCARoots_list - 2019/12/30 18:56:33.980789 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Node - 2019/12/30 18:56:34.101401 [WARN] serf: Shutdown without a Leave
TestCoordinate_Node - 2019/12/30 18:56:34.176566 [INFO] manager: shutting down
TestCoordinate_Node - 2019/12/30 18:56:34.177093 [INFO] agent: consul server down
TestCoordinate_Node - 2019/12/30 18:56:34.177175 [INFO] agent: shutdown complete
TestCoordinate_Node - 2019/12/30 18:56:34.177242 [INFO] agent: Stopping DNS server 127.0.0.1:18605 (tcp)
TestCoordinate_Node - 2019/12/30 18:56:34.177417 [INFO] agent: Stopping DNS server 127.0.0.1:18605 (udp)
TestCoordinate_Node - 2019/12/30 18:56:34.177635 [INFO] agent: Stopping HTTP server 127.0.0.1:18606 (tcp)
TestCoordinate_Node - 2019/12/30 18:56:34.178623 [INFO] agent: Waiting for endpoints to shut down
TestCoordinate_Node - 2019/12/30 18:56:34.178722 [INFO] agent: Endpoints down
--- PASS: TestCoordinate_Node (5.01s)
=== CONT  TestConnectCARoots_empty
TestCoordinate_Disabled_Response - 2019/12/30 18:56:34.310753 [INFO] agent: Synced node info
2019/12/30 18:56:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:21831f9e-a2ba-bbf9-86bd-a4e19c03a46e Address:127.0.0.1:18628}]
2019/12/30 18:56:34 [INFO]  raft: Node at 127.0.0.1:18628 [Follower] entering Follower state (Leader: "")
TestConnectCAConfig - 2019/12/30 18:56:34.315092 [INFO] serf: EventMemberJoin: Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e.dc1 127.0.0.1
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCARoots_empty - 2019/12/30 18:56:34.319503 [WARN] agent: Node name "Node ab13067f-9485-bd89-c437-fe4ed62e7197" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCAConfig - 2019/12/30 18:56:34.319980 [INFO] serf: EventMemberJoin: Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e 127.0.0.1
TestConnectCAConfig - 2019/12/30 18:56:34.320934 [INFO] consul: Handled member-join event for server "Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e.dc1" in area "wan"
TestConnectCAConfig - 2019/12/30 18:56:34.321251 [INFO] consul: Adding LAN server Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e (Addr: tcp/127.0.0.1:18628) (DC: dc1)
TestConnectCAConfig - 2019/12/30 18:56:34.321319 [INFO] agent: Started DNS server 127.0.0.1:18623 (udp)
TestConnectCARoots_empty - 2019/12/30 18:56:34.321806 [DEBUG] tlsutil: Update with version 1
TestConnectCAConfig - 2019/12/30 18:56:34.326312 [INFO] agent: Started DNS server 127.0.0.1:18623 (tcp)
TestConnectCAConfig - 2019/12/30 18:56:34.328929 [INFO] agent: Started HTTP server on 127.0.0.1:18624 (tcp)
TestConnectCAConfig - 2019/12/30 18:56:34.329114 [INFO] agent: started state syncer
TestConnectCARoots_empty - 2019/12/30 18:56:34.329189 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:34 [INFO]  raft: Node at 127.0.0.1:18628 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:34 [INFO]  raft: Node at 127.0.0.1:18628 [Leader] entering Leader state
2019/12/30 18:56:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:922a5526-d46b-d8e7-9e0b-92ff52a1195a Address:127.0.0.1:18634}]
TestConnectCARoots_list - 2019/12/30 18:56:34.938840 [INFO] serf: EventMemberJoin: Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a.dc1 127.0.0.1
TestConnectCAConfig - 2019/12/30 18:56:34.939601 [INFO] consul: cluster leadership acquired
TestConnectCAConfig - 2019/12/30 18:56:34.940094 [INFO] consul: New leader elected: Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e
2019/12/30 18:56:34 [INFO]  raft: Node at 127.0.0.1:18634 [Follower] entering Follower state (Leader: "")
TestConnectCARoots_list - 2019/12/30 18:56:34.954937 [INFO] serf: EventMemberJoin: Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a 127.0.0.1
TestConnectCARoots_list - 2019/12/30 18:56:34.956115 [INFO] consul: Handled member-join event for server "Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a.dc1" in area "wan"
TestConnectCARoots_list - 2019/12/30 18:56:34.956243 [INFO] consul: Adding LAN server Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a (Addr: tcp/127.0.0.1:18634) (DC: dc1)
TestConnectCARoots_list - 2019/12/30 18:56:34.962657 [INFO] agent: Started DNS server 127.0.0.1:18629 (tcp)
TestConnectCARoots_list - 2019/12/30 18:56:34.962744 [INFO] agent: Started DNS server 127.0.0.1:18629 (udp)
2019/12/30 18:56:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:34 [INFO]  raft: Node at 127.0.0.1:18634 [Candidate] entering Candidate state in term 2
TestConnectCARoots_list - 2019/12/30 18:56:34.986187 [INFO] agent: Started HTTP server on 127.0.0.1:18630 (tcp)
TestConnectCARoots_list - 2019/12/30 18:56:34.986328 [INFO] agent: started state syncer
2019/12/30 18:56:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ab13067f-9485-bd89-c437-fe4ed62e7197 Address:127.0.0.1:18640}]
2019/12/30 18:56:35 [INFO]  raft: Node at 127.0.0.1:18640 [Follower] entering Follower state (Leader: "")
TestConnectCARoots_empty - 2019/12/30 18:56:35.311384 [INFO] serf: EventMemberJoin: Node ab13067f-9485-bd89-c437-fe4ed62e7197.dc1 127.0.0.1
TestConnectCARoots_empty - 2019/12/30 18:56:35.340668 [INFO] serf: EventMemberJoin: Node ab13067f-9485-bd89-c437-fe4ed62e7197 127.0.0.1
TestConnectCARoots_empty - 2019/12/30 18:56:35.359489 [INFO] agent: Started DNS server 127.0.0.1:18635 (udp)
TestConnectCARoots_empty - 2019/12/30 18:56:35.361875 [INFO] consul: Handled member-join event for server "Node ab13067f-9485-bd89-c437-fe4ed62e7197.dc1" in area "wan"
TestConnectCARoots_empty - 2019/12/30 18:56:35.369340 [INFO] consul: Adding LAN server Node ab13067f-9485-bd89-c437-fe4ed62e7197 (Addr: tcp/127.0.0.1:18640) (DC: dc1)
2019/12/30 18:56:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:35 [INFO]  raft: Node at 127.0.0.1:18640 [Candidate] entering Candidate state in term 2
TestConnectCARoots_empty - 2019/12/30 18:56:35.381773 [INFO] agent: Started DNS server 127.0.0.1:18635 (tcp)
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.392948 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.393612 [DEBUG] consul: Skipping self join check for "Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863" since the cluster is too small
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.393844 [INFO] consul: member 'Node 9d29210b-1f1c-0005-51d3-8f47dbbd2863' joined, marking health alive
TestConnectCARoots_empty - 2019/12/30 18:56:35.432741 [INFO] agent: Started HTTP server on 127.0.0.1:18636 (tcp)
TestConnectCARoots_empty - 2019/12/30 18:56:35.432850 [INFO] agent: started state syncer
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.662162 [INFO] agent: Requesting shutdown
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.662277 [INFO] consul: shutting down server
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.662347 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:35 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:35 [INFO]  raft: Node at 127.0.0.1:18634 [Leader] entering Leader state
TestConnectCARoots_list - 2019/12/30 18:56:35.711576 [INFO] consul: cluster leadership acquired
TestConnectCARoots_list - 2019/12/30 18:56:35.712173 [INFO] consul: New leader elected: Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a
TestConnectCAConfig - 2019/12/30 18:56:35.713405 [INFO] agent: Synced node info
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.809786 [WARN] serf: Shutdown without a Leave
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.887510 [INFO] manager: shutting down
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.887948 [INFO] agent: consul server down
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.888000 [INFO] agent: shutdown complete
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.888054 [INFO] agent: Stopping DNS server 127.0.0.1:18617 (tcp)
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.888195 [INFO] agent: Stopping DNS server 127.0.0.1:18617 (udp)
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.888343 [INFO] agent: Stopping HTTP server 127.0.0.1:18618 (tcp)
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.888539 [INFO] agent: Waiting for endpoints to shut down
TestCoordinate_Disabled_Response - 2019/12/30 18:56:35.888609 [INFO] agent: Endpoints down
--- PASS: TestCoordinate_Disabled_Response (3.81s)
=== CONT  TestConfig_Apply_Decoding
WARNING: bootstrap = true: do not enable unless necessary
TestConfig_Apply_Decoding - 2019/12/30 18:56:35.980257 [WARN] agent: Node name "Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfig_Apply_Decoding - 2019/12/30 18:56:35.980740 [DEBUG] tlsutil: Update with version 1
TestConfig_Apply_Decoding - 2019/12/30 18:56:35.983582 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:36 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:36 [INFO]  raft: Node at 127.0.0.1:18640 [Leader] entering Leader state
TestConnectCARoots_empty - 2019/12/30 18:56:36.062567 [INFO] consul: cluster leadership acquired
TestConnectCARoots_empty - 2019/12/30 18:56:36.063194 [INFO] consul: New leader elected: Node ab13067f-9485-bd89-c437-fe4ed62e7197
TestConnectCARoots_list - 2019/12/30 18:56:36.236690 [INFO] agent: Synced node info
TestConnectCARoots_empty - 2019/12/30 18:56:36.627479 [INFO] agent: Synced node info
TestConnectCARoots_empty - 2019/12/30 18:56:36.627603 [DEBUG] agent: Node info in sync
TestConnectCAConfig - 2019/12/30 18:56:36.785428 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCARoots_empty - 2019/12/30 18:56:36.785615 [DEBUG] consul: Skipping self join check for "Node ab13067f-9485-bd89-c437-fe4ed62e7197" since the cluster is too small
TestConnectCARoots_empty - 2019/12/30 18:56:36.785844 [INFO] consul: member 'Node ab13067f-9485-bd89-c437-fe4ed62e7197' joined, marking health alive
TestConnectCAConfig - 2019/12/30 18:56:36.786746 [DEBUG] consul: Skipping self join check for "Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e" since the cluster is too small
TestConnectCAConfig - 2019/12/30 18:56:36.786927 [INFO] consul: member 'Node 21831f9e-a2ba-bbf9-86bd-a4e19c03a46e' joined, marking health alive
TestConnectCAConfig - 2019/12/30 18:56:36.835931 [DEBUG] agent: Node info in sync
TestConnectCAConfig - 2019/12/30 18:56:36.836050 [DEBUG] agent: Node info in sync
TestConnectCARoots_empty - 2019/12/30 18:56:37.005914 [INFO] agent: Requesting shutdown
TestConnectCARoots_empty - 2019/12/30 18:56:37.006022 [INFO] consul: shutting down server
TestConnectCARoots_empty - 2019/12/30 18:56:37.006075 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7ad89238-93d3-dcb1-2e3d-68c77568c0f4 Address:127.0.0.1:18646}]
2019/12/30 18:56:37 [INFO]  raft: Node at 127.0.0.1:18646 [Follower] entering Follower state (Leader: "")
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.088883 [INFO] serf: EventMemberJoin: Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4.dc1 127.0.0.1
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.093402 [INFO] serf: EventMemberJoin: Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4 127.0.0.1
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.094891 [INFO] consul: Adding LAN server Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4 (Addr: tcp/127.0.0.1:18646) (DC: dc1)
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.095064 [INFO] consul: Handled member-join event for server "Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4.dc1" in area "wan"
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.096358 [INFO] agent: Started DNS server 127.0.0.1:18641 (udp)
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.096847 [INFO] agent: Started DNS server 127.0.0.1:18641 (tcp)
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.101799 [INFO] agent: Started HTTP server on 127.0.0.1:18642 (tcp)
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.101928 [INFO] agent: started state syncer
2019/12/30 18:56:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:37 [INFO]  raft: Node at 127.0.0.1:18646 [Candidate] entering Candidate state in term 2
TestConnectCARoots_empty - 2019/12/30 18:56:37.168218 [WARN] serf: Shutdown without a Leave
TestConnectCARoots_empty - 2019/12/30 18:56:37.318256 [INFO] manager: shutting down
TestConnectCAConfig - 2019/12/30 18:56:37.318733 [INFO] connect: CA provider config updated
TestConnectCARoots_empty - 2019/12/30 18:56:37.318756 [INFO] agent: consul server down
TestConnectCARoots_empty - 2019/12/30 18:56:37.318947 [INFO] agent: shutdown complete
TestConnectCARoots_empty - 2019/12/30 18:56:37.319033 [INFO] agent: Stopping DNS server 127.0.0.1:18635 (tcp)
TestConnectCARoots_empty - 2019/12/30 18:56:37.319254 [INFO] agent: Stopping DNS server 127.0.0.1:18635 (udp)
TestConnectCARoots_empty - 2019/12/30 18:56:37.319477 [INFO] agent: Stopping HTTP server 127.0.0.1:18636 (tcp)
TestConnectCAConfig - 2019/12/30 18:56:37.319604 [INFO] agent: Requesting shutdown
TestConnectCAConfig - 2019/12/30 18:56:37.319678 [INFO] consul: shutting down server
TestConnectCARoots_empty - 2019/12/30 18:56:37.319702 [INFO] agent: Waiting for endpoints to shut down
TestConnectCAConfig - 2019/12/30 18:56:37.319752 [WARN] serf: Shutdown without a Leave
TestConnectCARoots_empty - 2019/12/30 18:56:37.319768 [INFO] agent: Endpoints down
--- PASS: TestConnectCARoots_empty (3.14s)
=== CONT  TestConfig_Apply_CAS
TestConnectCAConfig - 2019/12/30 18:56:37.393191 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestConfig_Apply_CAS - 2019/12/30 18:56:37.442081 [WARN] agent: Node name "Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfig_Apply_CAS - 2019/12/30 18:56:37.442791 [DEBUG] tlsutil: Update with version 1
TestConfig_Apply_CAS - 2019/12/30 18:56:37.445297 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectCARoots_list - 2019/12/30 18:56:37.477482 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCARoots_list - 2019/12/30 18:56:37.478039 [DEBUG] consul: Skipping self join check for "Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a" since the cluster is too small
TestConnectCARoots_list - 2019/12/30 18:56:37.478215 [INFO] consul: member 'Node 922a5526-d46b-d8e7-9e0b-92ff52a1195a' joined, marking health alive
TestConnectCAConfig - 2019/12/30 18:56:37.479664 [INFO] manager: shutting down
TestConnectCAConfig - 2019/12/30 18:56:37.480136 [INFO] agent: consul server down
TestConnectCAConfig - 2019/12/30 18:56:37.480205 [INFO] agent: shutdown complete
TestConnectCAConfig - 2019/12/30 18:56:37.480270 [INFO] agent: Stopping DNS server 127.0.0.1:18623 (tcp)
TestConnectCAConfig - 2019/12/30 18:56:37.480435 [INFO] agent: Stopping DNS server 127.0.0.1:18623 (udp)
TestConnectCAConfig - 2019/12/30 18:56:37.480652 [INFO] agent: Stopping HTTP server 127.0.0.1:18624 (tcp)
TestConnectCAConfig - 2019/12/30 18:56:37.480940 [INFO] agent: Waiting for endpoints to shut down
TestConnectCAConfig - 2019/12/30 18:56:37.481032 [INFO] agent: Endpoints down
--- PASS: TestConnectCAConfig (4.28s)
=== CONT  TestConfig_Apply
WARNING: bootstrap = true: do not enable unless necessary
TestConfig_Apply - 2019/12/30 18:56:37.622752 [WARN] agent: Node name "Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfig_Apply - 2019/12/30 18:56:37.630412 [DEBUG] tlsutil: Update with version 1
TestConfig_Apply - 2019/12/30 18:56:37.633114 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:37 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:37 [INFO]  raft: Node at 127.0.0.1:18646 [Leader] entering Leader state
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.654490 [INFO] consul: cluster leadership acquired
TestConfig_Apply_Decoding - 2019/12/30 18:56:37.654990 [INFO] consul: New leader elected: Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4
jones - 2019/12/30 18:56:37.720240 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:56:37.720338 [DEBUG] agent: Node info in sync
TestConfig_Apply_Decoding - 2019/12/30 18:56:38.094243 [INFO] agent: Synced node info
TestConfig_Apply_Decoding - 2019/12/30 18:56:38.094478 [DEBUG] agent: Node info in sync
TestConfig_Apply_Decoding - 2019/12/30 18:56:38.336475 [DEBUG] agent: Node info in sync
2019/12/30 18:56:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:50778108-6c97-bddc-24e2-cdc9ba9a5a6c Address:127.0.0.1:18652}]
2019/12/30 18:56:38 [INFO]  raft: Node at 127.0.0.1:18652 [Follower] entering Follower state (Leader: "")
TestConfig_Apply_CAS - 2019/12/30 18:56:38.509677 [INFO] serf: EventMemberJoin: Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c.dc1 127.0.0.1
TestConfig_Apply_CAS - 2019/12/30 18:56:38.517889 [INFO] serf: EventMemberJoin: Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c 127.0.0.1
TestConfig_Apply_CAS - 2019/12/30 18:56:38.526177 [INFO] consul: Adding LAN server Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c (Addr: tcp/127.0.0.1:18652) (DC: dc1)
TestConfig_Apply_CAS - 2019/12/30 18:56:38.530027 [INFO] consul: Handled member-join event for server "Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c.dc1" in area "wan"
TestConfig_Apply_CAS - 2019/12/30 18:56:38.531664 [INFO] agent: Started DNS server 127.0.0.1:18647 (tcp)
TestConfig_Apply_CAS - 2019/12/30 18:56:38.532918 [INFO] agent: Started DNS server 127.0.0.1:18647 (udp)
TestConfig_Apply_CAS - 2019/12/30 18:56:38.536519 [INFO] agent: Started HTTP server on 127.0.0.1:18648 (tcp)
TestConfig_Apply_CAS - 2019/12/30 18:56:38.536776 [INFO] agent: started state syncer
2019/12/30 18:56:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:38 [INFO]  raft: Node at 127.0.0.1:18652 [Candidate] entering Candidate state in term 2
TestConnectCARoots_list - 2019/12/30 18:56:38.603929 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c0ebbd8c-7c8c-9545-7261-d06f055cabd5 Address:127.0.0.1:18658}]
2019/12/30 18:56:38 [INFO]  raft: Node at 127.0.0.1:18658 [Follower] entering Follower state (Leader: "")
TestConfig_Apply - 2019/12/30 18:56:38.764759 [INFO] serf: EventMemberJoin: Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5.dc1 127.0.0.1
TestConfig_Apply - 2019/12/30 18:56:38.768342 [INFO] serf: EventMemberJoin: Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5 127.0.0.1
TestConfig_Apply - 2019/12/30 18:56:38.769178 [INFO] consul: Handled member-join event for server "Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5.dc1" in area "wan"
TestConfig_Apply - 2019/12/30 18:56:38.769579 [INFO] consul: Adding LAN server Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5 (Addr: tcp/127.0.0.1:18658) (DC: dc1)
TestConfig_Apply - 2019/12/30 18:56:38.769782 [INFO] agent: Started DNS server 127.0.0.1:18653 (udp)
TestConfig_Apply - 2019/12/30 18:56:38.770131 [INFO] agent: Started DNS server 127.0.0.1:18653 (tcp)
TestConfig_Apply - 2019/12/30 18:56:38.772454 [INFO] agent: Started HTTP server on 127.0.0.1:18654 (tcp)
TestConfig_Apply - 2019/12/30 18:56:38.772554 [INFO] agent: started state syncer
2019/12/30 18:56:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:38 [INFO]  raft: Node at 127.0.0.1:18658 [Candidate] entering Candidate state in term 2
TestConnectCARoots_list - 2019/12/30 18:56:38.938220 [INFO] connect: CA rotated to new root under provider "consul"
TestConnectCARoots_list - 2019/12/30 18:56:38.938898 [INFO] agent: Requesting shutdown
TestConnectCARoots_list - 2019/12/30 18:56:38.938989 [INFO] consul: shutting down server
TestConnectCARoots_list - 2019/12/30 18:56:38.939044 [WARN] serf: Shutdown without a Leave
TestConnectCARoots_list - 2019/12/30 18:56:39.078628 [WARN] serf: Shutdown without a Leave
TestConnectCARoots_list - 2019/12/30 18:56:39.143399 [INFO] manager: shutting down
TestConnectCARoots_list - 2019/12/30 18:56:39.144286 [INFO] agent: consul server down
TestConnectCARoots_list - 2019/12/30 18:56:39.144354 [INFO] agent: shutdown complete
TestConnectCARoots_list - 2019/12/30 18:56:39.144485 [INFO] agent: Stopping DNS server 127.0.0.1:18629 (tcp)
2019/12/30 18:56:39 [INFO]  raft: Election won. Tally: 1
TestConnectCARoots_list - 2019/12/30 18:56:39.144668 [INFO] agent: Stopping DNS server 127.0.0.1:18629 (udp)
2019/12/30 18:56:39 [INFO]  raft: Node at 127.0.0.1:18652 [Leader] entering Leader state
TestConnectCARoots_list - 2019/12/30 18:56:39.144886 [INFO] agent: Stopping HTTP server 127.0.0.1:18630 (tcp)
TestConnectCARoots_list - 2019/12/30 18:56:39.145131 [INFO] agent: Waiting for endpoints to shut down
TestConnectCARoots_list - 2019/12/30 18:56:39.145219 [INFO] agent: Endpoints down
--- PASS: TestConnectCARoots_list (5.28s)
=== CONT  TestConfig_Delete
TestConfig_Apply_CAS - 2019/12/30 18:56:39.146111 [INFO] consul: cluster leadership acquired
TestConfig_Apply_CAS - 2019/12/30 18:56:39.146588 [INFO] consul: New leader elected: Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c
WARNING: bootstrap = true: do not enable unless necessary
TestConfig_Delete - 2019/12/30 18:56:39.205516 [WARN] agent: Node name "Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfig_Delete - 2019/12/30 18:56:39.206195 [DEBUG] tlsutil: Update with version 1
TestConfig_Delete - 2019/12/30 18:56:39.208690 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.211500 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.211994 [DEBUG] consul: Skipping self join check for "Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4" since the cluster is too small
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.212149 [INFO] consul: member 'Node 7ad89238-93d3-dcb1-2e3d-68c77568c0f4' joined, marking health alive
2019/12/30 18:56:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:39 [INFO]  raft: Node at 127.0.0.1:18658 [Leader] entering Leader state
TestConfig_Apply - 2019/12/30 18:56:39.385808 [INFO] consul: cluster leadership acquired
TestConfig_Apply - 2019/12/30 18:56:39.386251 [INFO] consul: New leader elected: Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.386247 [INFO] agent: Requesting shutdown
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.386352 [INFO] consul: shutting down server
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.386405 [WARN] serf: Shutdown without a Leave
TestConfig_Apply_CAS - 2019/12/30 18:56:39.485956 [INFO] agent: Synced node info
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.568374 [WARN] serf: Shutdown without a Leave
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.643261 [INFO] manager: shutting down
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.643665 [INFO] agent: consul server down
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.643718 [INFO] agent: shutdown complete
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.643772 [INFO] agent: Stopping DNS server 127.0.0.1:18641 (tcp)
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.643909 [INFO] agent: Stopping DNS server 127.0.0.1:18641 (udp)
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.644079 [INFO] agent: Stopping HTTP server 127.0.0.1:18642 (tcp)
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.644332 [INFO] agent: Waiting for endpoints to shut down
TestConfig_Apply_Decoding - 2019/12/30 18:56:39.644377 [INFO] agent: Endpoints down
--- PASS: TestConfig_Apply_Decoding (3.76s)
=== CONT  TestConfig_Get
WARNING: bootstrap = true: do not enable unless necessary
TestConfig_Get - 2019/12/30 18:56:39.703932 [WARN] agent: Node name "Node 34dca06e-a645-e79e-7bca-dbce3eea55d7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfig_Get - 2019/12/30 18:56:39.704479 [DEBUG] tlsutil: Update with version 1
TestConfig_Get - 2019/12/30 18:56:39.706719 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConfig_Apply - 2019/12/30 18:56:39.811050 [INFO] agent: Synced node info
2019/12/30 18:56:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:527c21f4-b6b1-f4f4-3f70-b07d1d2e0462 Address:127.0.0.1:18664}]
2019/12/30 18:56:40 [INFO]  raft: Node at 127.0.0.1:18664 [Follower] entering Follower state (Leader: "")
TestConfig_Delete - 2019/12/30 18:56:40.147280 [INFO] serf: EventMemberJoin: Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462.dc1 127.0.0.1
TestConfig_Delete - 2019/12/30 18:56:40.151313 [INFO] serf: EventMemberJoin: Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462 127.0.0.1
TestConfig_Delete - 2019/12/30 18:56:40.152784 [INFO] consul: Handled member-join event for server "Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462.dc1" in area "wan"
TestConfig_Delete - 2019/12/30 18:56:40.152882 [INFO] agent: Started DNS server 127.0.0.1:18659 (udp)
TestConfig_Delete - 2019/12/30 18:56:40.153754 [INFO] agent: Started DNS server 127.0.0.1:18659 (tcp)
TestConfig_Delete - 2019/12/30 18:56:40.153379 [INFO] consul: Adding LAN server Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462 (Addr: tcp/127.0.0.1:18664) (DC: dc1)
TestConfig_Delete - 2019/12/30 18:56:40.157968 [INFO] agent: Started HTTP server on 127.0.0.1:18660 (tcp)
TestConfig_Delete - 2019/12/30 18:56:40.158274 [INFO] agent: started state syncer
2019/12/30 18:56:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:40 [INFO]  raft: Node at 127.0.0.1:18664 [Candidate] entering Candidate state in term 2
TestConfig_Apply_CAS - 2019/12/30 18:56:40.552357 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConfig_Apply_CAS - 2019/12/30 18:56:40.564261 [DEBUG] consul: Skipping self join check for "Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c" since the cluster is too small
TestConfig_Apply_CAS - 2019/12/30 18:56:40.564946 [INFO] consul: member 'Node 50778108-6c97-bddc-24e2-cdc9ba9a5a6c' joined, marking health alive
2019/12/30 18:56:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:34dca06e-a645-e79e-7bca-dbce3eea55d7 Address:127.0.0.1:18670}]
2019/12/30 18:56:40 [INFO]  raft: Node at 127.0.0.1:18670 [Follower] entering Follower state (Leader: "")
TestConfig_Get - 2019/12/30 18:56:40.670131 [INFO] serf: EventMemberJoin: Node 34dca06e-a645-e79e-7bca-dbce3eea55d7.dc1 127.0.0.1
TestConfig_Get - 2019/12/30 18:56:40.679676 [INFO] serf: EventMemberJoin: Node 34dca06e-a645-e79e-7bca-dbce3eea55d7 127.0.0.1
TestConfig_Get - 2019/12/30 18:56:40.681155 [INFO] consul: Adding LAN server Node 34dca06e-a645-e79e-7bca-dbce3eea55d7 (Addr: tcp/127.0.0.1:18670) (DC: dc1)
TestConfig_Get - 2019/12/30 18:56:40.681354 [INFO] consul: Handled member-join event for server "Node 34dca06e-a645-e79e-7bca-dbce3eea55d7.dc1" in area "wan"
TestConfig_Get - 2019/12/30 18:56:40.682913 [INFO] agent: Started DNS server 127.0.0.1:18665 (tcp)
TestConfig_Get - 2019/12/30 18:56:40.683588 [INFO] agent: Started DNS server 127.0.0.1:18665 (udp)
TestConfig_Get - 2019/12/30 18:56:40.686555 [INFO] agent: Started HTTP server on 127.0.0.1:18666 (tcp)
TestConfig_Get - 2019/12/30 18:56:40.687044 [INFO] agent: started state syncer
2019/12/30 18:56:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:40 [INFO]  raft: Node at 127.0.0.1:18670 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:56:41.010842 [DEBUG] consul: Skipping self join check for "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786" since the cluster is too small
2019/12/30 18:56:41 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:41 [INFO]  raft: Node at 127.0.0.1:18664 [Leader] entering Leader state
TestConfig_Delete - 2019/12/30 18:56:41.012004 [INFO] consul: cluster leadership acquired
TestConfig_Delete - 2019/12/30 18:56:41.012351 [INFO] consul: New leader elected: Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462
TestConfig_Apply_CAS - 2019/12/30 18:56:41.944787 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestConfig_Apply_CAS - 2019/12/30 18:56:41.944881 [DEBUG] agent: Node info in sync
TestConfig_Apply_CAS - 2019/12/30 18:56:41.944966 [DEBUG] agent: Node info in sync
TestConfig_Apply - 2019/12/30 18:56:42.376976 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConfig_Apply - 2019/12/30 18:56:42.377424 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConfig_Apply - 2019/12/30 18:56:42.377855 [DEBUG] consul: Skipping self join check for "Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5" since the cluster is too small
TestConfig_Apply - 2019/12/30 18:56:42.378015 [INFO] consul: member 'Node c0ebbd8c-7c8c-9545-7261-d06f055cabd5' joined, marking health alive
TestConfig_Apply_CAS - 2019/12/30 18:56:42.487031 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:42 [INFO]  raft: Node at 127.0.0.1:18670 [Leader] entering Leader state
TestConfig_Apply - 2019/12/30 18:56:42.769360 [INFO] agent: Requesting shutdown
TestConfig_Apply - 2019/12/30 18:56:42.769517 [INFO] consul: shutting down server
TestConfig_Apply - 2019/12/30 18:56:42.769575 [WARN] serf: Shutdown without a Leave
TestConfig_Apply - 2019/12/30 18:56:42.771133 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestConfig_Apply - 2019/12/30 18:56:42.771207 [DEBUG] agent: Node info in sync
TestConfig_Apply - 2019/12/30 18:56:42.771284 [DEBUG] agent: Node info in sync
TestConfig_Delete - 2019/12/30 18:56:42.772268 [INFO] agent: Synced node info
TestConfig_Get - 2019/12/30 18:56:42.775000 [INFO] consul: cluster leadership acquired
TestConfig_Get - 2019/12/30 18:56:42.775456 [INFO] consul: New leader elected: Node 34dca06e-a645-e79e-7bca-dbce3eea55d7
TestConfig_Apply - 2019/12/30 18:56:42.943252 [WARN] serf: Shutdown without a Leave
TestConfig_Apply_CAS - 2019/12/30 18:56:42.948147 [INFO] agent: Requesting shutdown
TestConfig_Apply_CAS - 2019/12/30 18:56:42.948255 [INFO] consul: shutting down server
TestConfig_Apply_CAS - 2019/12/30 18:56:42.948303 [WARN] serf: Shutdown without a Leave
TestConfig_Apply - 2019/12/30 18:56:43.060007 [INFO] manager: shutting down
TestConfig_Apply - 2019/12/30 18:56:43.060702 [INFO] agent: consul server down
TestConfig_Apply - 2019/12/30 18:56:43.060781 [INFO] agent: shutdown complete
TestConfig_Apply - 2019/12/30 18:56:43.060848 [INFO] agent: Stopping DNS server 127.0.0.1:18653 (tcp)
TestConfig_Apply - 2019/12/30 18:56:43.060997 [INFO] agent: Stopping DNS server 127.0.0.1:18653 (udp)
TestConfig_Apply - 2019/12/30 18:56:43.061162 [INFO] agent: Stopping HTTP server 127.0.0.1:18654 (tcp)
TestConfig_Apply - 2019/12/30 18:56:43.061381 [INFO] agent: Waiting for endpoints to shut down
TestConfig_Apply - 2019/12/30 18:56:43.061459 [INFO] agent: Endpoints down
--- PASS: TestConfig_Apply (5.58s)
=== CONT  TestCatalogNodeServices_ConnectProxy
TestConfig_Apply_CAS - 2019/12/30 18:56:43.126661 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:43.141061 [WARN] agent: Node name "Node 90aca5b1-a946-0929-6665-306f12b36f52" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:43.141496 [DEBUG] tlsutil: Update with version 1
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:43.147761 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConfig_Get - 2019/12/30 18:56:43.236082 [INFO] agent: Synced node info
TestConfig_Get - 2019/12/30 18:56:43.236214 [DEBUG] agent: Node info in sync
TestConfig_Apply_CAS - 2019/12/30 18:56:43.236566 [INFO] manager: shutting down
jones - 2019/12/30 18:56:43.236868 [DEBUG] consul: Skipping self join check for "Node 90e88a15-5862-4de0-2f1f-c638261bac76" since the cluster is too small
TestConfig_Apply_CAS - 2019/12/30 18:56:43.237449 [INFO] agent: consul server down
TestConfig_Apply_CAS - 2019/12/30 18:56:43.237515 [INFO] agent: shutdown complete
TestConfig_Apply_CAS - 2019/12/30 18:56:43.237591 [INFO] agent: Stopping DNS server 127.0.0.1:18647 (tcp)
TestConfig_Apply_CAS - 2019/12/30 18:56:43.237767 [INFO] agent: Stopping DNS server 127.0.0.1:18647 (udp)
TestConfig_Apply_CAS - 2019/12/30 18:56:43.237974 [INFO] agent: Stopping HTTP server 127.0.0.1:18648 (tcp)
TestConfig_Apply_CAS - 2019/12/30 18:56:43.238214 [INFO] agent: Waiting for endpoints to shut down
TestConfig_Apply_CAS - 2019/12/30 18:56:43.238293 [INFO] agent: Endpoints down
--- PASS: TestConfig_Apply_CAS (5.92s)
=== CONT  TestCatalogNodeServices
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodeServices - 2019/12/30 18:56:43.337311 [WARN] agent: Node name "Node 62a99c6a-a04f-b72a-9029-7147cdb5938e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodeServices - 2019/12/30 18:56:43.337867 [DEBUG] tlsutil: Update with version 1
TestCatalogNodeServices - 2019/12/30 18:56:43.341072 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConfig_Delete - 2019/12/30 18:56:44.154794 [DEBUG] agent: Node info in sync
TestConfig_Delete - 2019/12/30 18:56:44.154939 [DEBUG] agent: Node info in sync
TestConfig_Delete - 2019/12/30 18:56:44.493986 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConfig_Delete - 2019/12/30 18:56:44.494571 [DEBUG] consul: Skipping self join check for "Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462" since the cluster is too small
TestConfig_Delete - 2019/12/30 18:56:44.494742 [INFO] consul: member 'Node 527c21f4-b6b1-f4f4-3f70-b07d1d2e0462' joined, marking health alive
TestConfig_Get - 2019/12/30 18:56:44.819244 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConfig_Get - 2019/12/30 18:56:44.819889 [DEBUG] consul: Skipping self join check for "Node 34dca06e-a645-e79e-7bca-dbce3eea55d7" since the cluster is too small
TestConfig_Get - 2019/12/30 18:56:44.820069 [INFO] consul: member 'Node 34dca06e-a645-e79e-7bca-dbce3eea55d7' joined, marking health alive
2019/12/30 18:56:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:90aca5b1-a946-0929-6665-306f12b36f52 Address:127.0.0.1:18676}]
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.824091 [INFO] serf: EventMemberJoin: Node 90aca5b1-a946-0929-6665-306f12b36f52.dc1 127.0.0.1
2019/12/30 18:56:44 [INFO]  raft: Node at 127.0.0.1:18676 [Follower] entering Follower state (Leader: "")
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.834232 [INFO] serf: EventMemberJoin: Node 90aca5b1-a946-0929-6665-306f12b36f52 127.0.0.1
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.838751 [INFO] consul: Handled member-join event for server "Node 90aca5b1-a946-0929-6665-306f12b36f52.dc1" in area "wan"
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.839131 [INFO] consul: Adding LAN server Node 90aca5b1-a946-0929-6665-306f12b36f52 (Addr: tcp/127.0.0.1:18676) (DC: dc1)
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.839945 [INFO] agent: Started DNS server 127.0.0.1:18671 (udp)
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.840114 [INFO] agent: Started DNS server 127.0.0.1:18671 (tcp)
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.842999 [INFO] agent: Started HTTP server on 127.0.0.1:18672 (tcp)
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:44.843114 [INFO] agent: started state syncer
2019/12/30 18:56:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:44 [INFO]  raft: Node at 127.0.0.1:18676 [Candidate] entering Candidate state in term 2
TestConfig_Get - 2019/12/30 18:56:44.944870 [DEBUG] agent: Node info in sync
2019/12/30 18:56:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:62a99c6a-a04f-b72a-9029-7147cdb5938e Address:127.0.0.1:18682}]
2019/12/30 18:56:45 [INFO]  raft: Node at 127.0.0.1:18682 [Follower] entering Follower state (Leader: "")
TestCatalogNodeServices - 2019/12/30 18:56:45.106695 [INFO] serf: EventMemberJoin: Node 62a99c6a-a04f-b72a-9029-7147cdb5938e.dc1 127.0.0.1
TestCatalogNodeServices - 2019/12/30 18:56:45.114232 [INFO] serf: EventMemberJoin: Node 62a99c6a-a04f-b72a-9029-7147cdb5938e 127.0.0.1
TestCatalogNodeServices - 2019/12/30 18:56:45.115832 [INFO] consul: Adding LAN server Node 62a99c6a-a04f-b72a-9029-7147cdb5938e (Addr: tcp/127.0.0.1:18682) (DC: dc1)
TestCatalogNodeServices - 2019/12/30 18:56:45.117082 [INFO] consul: Handled member-join event for server "Node 62a99c6a-a04f-b72a-9029-7147cdb5938e.dc1" in area "wan"
TestCatalogNodeServices - 2019/12/30 18:56:45.118727 [INFO] agent: Started DNS server 127.0.0.1:18677 (tcp)
TestCatalogNodeServices - 2019/12/30 18:56:45.126359 [INFO] agent: Started DNS server 127.0.0.1:18677 (udp)
TestCatalogNodeServices - 2019/12/30 18:56:45.128861 [INFO] agent: Started HTTP server on 127.0.0.1:18678 (tcp)
TestCatalogNodeServices - 2019/12/30 18:56:45.129062 [INFO] agent: started state syncer
2019/12/30 18:56:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:45 [INFO]  raft: Node at 127.0.0.1:18682 [Candidate] entering Candidate state in term 2
TestConfig_Delete - 2019/12/30 18:56:45.244749 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:56:45.395316 [DEBUG] consul: Skipping self join check for "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6" since the cluster is too small
TestConfig_Get - 2019/12/30 18:56:45.485408 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConfig_Delete - 2019/12/30 18:56:45.677841 [INFO] agent: Requesting shutdown
TestConfig_Delete - 2019/12/30 18:56:45.677948 [INFO] consul: shutting down server
TestConfig_Delete - 2019/12/30 18:56:45.678000 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:45 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:45 [INFO]  raft: Node at 127.0.0.1:18676 [Leader] entering Leader state
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:45.684666 [INFO] consul: cluster leadership acquired
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:45.685079 [INFO] consul: New leader elected: Node 90aca5b1-a946-0929-6665-306f12b36f52
TestConfig_Delete - 2019/12/30 18:56:45.901690 [WARN] serf: Shutdown without a Leave
TestConfig_Delete - 2019/12/30 18:56:46.118620 [INFO] manager: shutting down
TestConfig_Delete - 2019/12/30 18:56:46.119818 [INFO] agent: consul server down
TestConfig_Delete - 2019/12/30 18:56:46.119896 [INFO] agent: shutdown complete
TestConfig_Delete - 2019/12/30 18:56:46.119970 [INFO] agent: Stopping DNS server 127.0.0.1:18659 (tcp)
TestConfig_Delete - 2019/12/30 18:56:46.120169 [INFO] agent: Stopping DNS server 127.0.0.1:18659 (udp)
TestConfig_Delete - 2019/12/30 18:56:46.120411 [INFO] agent: Stopping HTTP server 127.0.0.1:18660 (tcp)
TestConfig_Delete - 2019/12/30 18:56:46.120705 [INFO] agent: Waiting for endpoints to shut down
TestConfig_Delete - 2019/12/30 18:56:46.120802 [INFO] agent: Endpoints down
--- PASS: TestConfig_Delete (6.98s)
=== CONT  TestCatalogConnectServiceNodes_Filter
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:46.219053 [WARN] agent: Node name "Node 091f7a78-6b32-8b90-5b99-89f93be30069" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:46.219832 [DEBUG] tlsutil: Update with version 1
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:46.222468 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:46 [INFO]  raft: Node at 127.0.0.1:18682 [Leader] entering Leader state
TestCatalogNodeServices - 2019/12/30 18:56:46.432392 [INFO] consul: cluster leadership acquired
TestCatalogNodeServices - 2019/12/30 18:56:46.432858 [INFO] consul: New leader elected: Node 62a99c6a-a04f-b72a-9029-7147cdb5938e
TestConfig_Get - 2019/12/30 18:56:46.435180 [INFO] agent: Requesting shutdown
TestConfig_Get - 2019/12/30 18:56:46.435259 [INFO] consul: shutting down server
TestConfig_Get - 2019/12/30 18:56:46.435309 [WARN] serf: Shutdown without a Leave
TestConfig_Get - 2019/12/30 18:56:46.594116 [WARN] serf: Shutdown without a Leave
TestConfig_Get - 2019/12/30 18:56:46.918672 [INFO] manager: shutting down
TestConfig_Get - 2019/12/30 18:56:46.920021 [INFO] agent: consul server down
TestConfig_Get - 2019/12/30 18:56:46.920101 [INFO] agent: shutdown complete
TestConfig_Get - 2019/12/30 18:56:46.920194 [INFO] agent: Stopping DNS server 127.0.0.1:18665 (tcp)
TestConfig_Get - 2019/12/30 18:56:46.920420 [INFO] agent: Stopping DNS server 127.0.0.1:18665 (udp)
TestConfig_Get - 2019/12/30 18:56:46.920641 [INFO] agent: Stopping HTTP server 127.0.0.1:18666 (tcp)
TestConfig_Get - 2019/12/30 18:56:46.920916 [INFO] agent: Waiting for endpoints to shut down
TestConfig_Get - 2019/12/30 18:56:46.921021 [INFO] agent: Endpoints down
--- PASS: TestConfig_Get (7.28s)
=== CONT  TestCatalogConnectServiceNodes_good
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:46.999867 [WARN] agent: Node name "Node ff276e23-1cc6-ca08-4929-3453cd6b138d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:47.000292 [DEBUG] tlsutil: Update with version 1
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:47.002477 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:47.196803 [INFO] agent: Synced node info
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:47.196926 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:56:47.561374 [DEBUG] consul: Skipping self join check for "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef" since the cluster is too small
TestCatalogNodeServices - 2019/12/30 18:56:47.568585 [INFO] agent: Synced node info
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:48.062657 [DEBUG] agent: Node info in sync
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:50.010463 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodeServices - 2019/12/30 18:56:50.104677 [DEBUG] agent: Node info in sync
TestCatalogNodeServices - 2019/12/30 18:56:50.104833 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:56:50.361552 [DEBUG] consul: Skipping self join check for "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d" since the cluster is too small
2019/12/30 18:56:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:091f7a78-6b32-8b90-5b99-89f93be30069 Address:127.0.0.1:18688}]
2019/12/30 18:56:50 [INFO]  raft: Node at 127.0.0.1:18688 [Follower] entering Follower state (Leader: "")
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.623611 [INFO] serf: EventMemberJoin: Node 091f7a78-6b32-8b90-5b99-89f93be30069.dc1 127.0.0.1
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.633387 [INFO] serf: EventMemberJoin: Node 091f7a78-6b32-8b90-5b99-89f93be30069 127.0.0.1
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.634489 [INFO] consul: Adding LAN server Node 091f7a78-6b32-8b90-5b99-89f93be30069 (Addr: tcp/127.0.0.1:18688) (DC: dc1)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.634848 [INFO] consul: Handled member-join event for server "Node 091f7a78-6b32-8b90-5b99-89f93be30069.dc1" in area "wan"
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.635154 [INFO] agent: Started DNS server 127.0.0.1:18683 (udp)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.635324 [INFO] agent: Started DNS server 127.0.0.1:18683 (tcp)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.637780 [INFO] agent: Started HTTP server on 127.0.0.1:18684 (tcp)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:50.637899 [INFO] agent: started state syncer
2019/12/30 18:56:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:50 [INFO]  raft: Node at 127.0.0.1:18688 [Candidate] entering Candidate state in term 2
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:50.728100 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:50.728603 [DEBUG] consul: Skipping self join check for "Node 90aca5b1-a946-0929-6665-306f12b36f52" since the cluster is too small
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:50.729020 [INFO] consul: member 'Node 90aca5b1-a946-0929-6665-306f12b36f52' joined, marking health alive
TestCatalogNodeServices - 2019/12/30 18:56:50.732139 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ff276e23-1cc6-ca08-4929-3453cd6b138d Address:127.0.0.1:18694}]
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.736053 [INFO] serf: EventMemberJoin: Node ff276e23-1cc6-ca08-4929-3453cd6b138d.dc1 127.0.0.1
2019/12/30 18:56:50 [INFO]  raft: Node at 127.0.0.1:18694 [Follower] entering Follower state (Leader: "")
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.739275 [INFO] serf: EventMemberJoin: Node ff276e23-1cc6-ca08-4929-3453cd6b138d 127.0.0.1
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.740055 [INFO] consul: Handled member-join event for server "Node ff276e23-1cc6-ca08-4929-3453cd6b138d.dc1" in area "wan"
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.740367 [INFO] consul: Adding LAN server Node ff276e23-1cc6-ca08-4929-3453cd6b138d (Addr: tcp/127.0.0.1:18694) (DC: dc1)
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.740550 [INFO] agent: Started DNS server 127.0.0.1:18689 (udp)
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.740883 [INFO] agent: Started DNS server 127.0.0.1:18689 (tcp)
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.743192 [INFO] agent: Started HTTP server on 127.0.0.1:18690 (tcp)
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:50.743277 [INFO] agent: started state syncer
2019/12/30 18:56:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:50 [INFO]  raft: Node at 127.0.0.1:18694 [Candidate] entering Candidate state in term 2
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.297014 [INFO] agent: Requesting shutdown
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.297116 [INFO] consul: shutting down server
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.297167 [WARN] serf: Shutdown without a Leave
TestCatalogNodeServices - 2019/12/30 18:56:51.302299 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodeServices - 2019/12/30 18:56:51.302848 [DEBUG] consul: Skipping self join check for "Node 62a99c6a-a04f-b72a-9029-7147cdb5938e" since the cluster is too small
TestCatalogNodeServices - 2019/12/30 18:56:51.303008 [INFO] consul: member 'Node 62a99c6a-a04f-b72a-9029-7147cdb5938e' joined, marking health alive
2019/12/30 18:56:51 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:51 [INFO]  raft: Node at 127.0.0.1:18688 [Leader] entering Leader state
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.397119 [WARN] serf: Shutdown without a Leave
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:51.398744 [INFO] consul: cluster leadership acquired
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:51.399306 [INFO] consul: New leader elected: Node 091f7a78-6b32-8b90-5b99-89f93be30069
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.476968 [INFO] manager: shutting down
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.477715 [INFO] agent: consul server down
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.477776 [INFO] agent: shutdown complete
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.477833 [INFO] agent: Stopping DNS server 127.0.0.1:18671 (tcp)
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.477978 [INFO] agent: Stopping DNS server 127.0.0.1:18671 (udp)
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.478135 [INFO] agent: Stopping HTTP server 127.0.0.1:18672 (tcp)
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.478333 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodeServices_ConnectProxy - 2019/12/30 18:56:51.478404 [INFO] agent: Endpoints down
--- PASS: TestCatalogNodeServices_ConnectProxy (8.42s)
=== CONT  TestCatalogServiceNodes_ConnectProxy
2019/12/30 18:56:51 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:51 [INFO]  raft: Node at 127.0.0.1:18694 [Leader] entering Leader state
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:51.486344 [INFO] consul: cluster leadership acquired
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:51.486792 [INFO] consul: New leader elected: Node ff276e23-1cc6-ca08-4929-3453cd6b138d
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:51.576792 [WARN] agent: Node name "Node 01e5a8b9-abbf-383f-a6a9-b3a33d5168d5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:51.577384 [DEBUG] tlsutil: Update with version 1
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:51.580469 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:51.777634 [INFO] agent: Synced node info
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:51.777751 [DEBUG] agent: Node info in sync
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:52.621083 [INFO] agent: Synced node info
TestCatalogNodeServices - 2019/12/30 18:56:52.787038 [INFO] agent: Requesting shutdown
TestCatalogNodeServices - 2019/12/30 18:56:52.787132 [INFO] consul: shutting down server
TestCatalogNodeServices - 2019/12/30 18:56:52.787182 [WARN] serf: Shutdown without a Leave
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:52.817545 [DEBUG] agent: Node info in sync
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:52.817673 [DEBUG] agent: Node info in sync
TestCatalogNodeServices - 2019/12/30 18:56:52.885170 [WARN] serf: Shutdown without a Leave
TestCatalogNodeServices - 2019/12/30 18:56:52.977003 [INFO] manager: shutting down
TestCatalogNodeServices - 2019/12/30 18:56:52.979011 [INFO] agent: consul server down
TestCatalogNodeServices - 2019/12/30 18:56:52.979133 [INFO] agent: shutdown complete
TestCatalogNodeServices - 2019/12/30 18:56:52.979457 [INFO] agent: Stopping DNS server 127.0.0.1:18677 (tcp)
TestCatalogNodeServices - 2019/12/30 18:56:52.979893 [INFO] agent: Stopping DNS server 127.0.0.1:18677 (udp)
TestCatalogNodeServices - 2019/12/30 18:56:52.980239 [INFO] agent: Stopping HTTP server 127.0.0.1:18678 (tcp)
TestCatalogNodeServices - 2019/12/30 18:56:52.980488 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodeServices - 2019/12/30 18:56:52.980717 [INFO] agent: Endpoints down
--- PASS: TestCatalogNodeServices (9.74s)
=== CONT  TestCatalogServiceNodes_DistanceSort
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:53.042243 [WARN] agent: Node name "Node 9b42bd13-0b35-1566-2adb-46af2525a488" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:53.042997 [DEBUG] tlsutil: Update with version 1
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:53.048478 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.100006 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:56:53.253219 [DEBUG] consul: Skipping self join check for "Node 5122c9d8-8979-c841-956f-094a90e62880" since the cluster is too small
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.254280 [INFO] agent: Requesting shutdown
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.254360 [INFO] consul: shutting down server
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.254592 [WARN] serf: Shutdown without a Leave
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.335208 [WARN] serf: Shutdown without a Leave
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.426943 [INFO] manager: shutting down
2019/12/30 18:56:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:01e5a8b9-abbf-383f-a6a9-b3a33d5168d5 Address:127.0.0.1:18700}]
2019/12/30 18:56:53 [INFO]  raft: Node at 127.0.0.1:18700 [Follower] entering Follower state (Leader: "")
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.431279 [INFO] serf: EventMemberJoin: Node 01e5a8b9-abbf-383f-a6a9-b3a33d5168d5.dc1 127.0.0.1
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.434660 [INFO] serf: EventMemberJoin: Node 01e5a8b9-abbf-383f-a6a9-b3a33d5168d5 127.0.0.1
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.435899 [INFO] consul: Adding LAN server Node 01e5a8b9-abbf-383f-a6a9-b3a33d5168d5 (Addr: tcp/127.0.0.1:18700) (DC: dc1)
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.435945 [INFO] agent: Started DNS server 127.0.0.1:18695 (udp)
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.436398 [INFO] consul: Handled member-join event for server "Node 01e5a8b9-abbf-383f-a6a9-b3a33d5168d5.dc1" in area "wan"
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.436424 [INFO] agent: Started DNS server 127.0.0.1:18695 (tcp)
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.439021 [INFO] agent: Started HTTP server on 127.0.0.1:18696 (tcp)
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:53.439132 [INFO] agent: started state syncer
2019/12/30 18:56:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:53 [INFO]  raft: Node at 127.0.0.1:18700 [Candidate] entering Candidate state in term 2
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.528761 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.528805 [INFO] agent: consul server down
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.528949 [INFO] agent: shutdown complete
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.529002 [INFO] agent: Stopping DNS server 127.0.0.1:18689 (tcp)
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.529141 [INFO] agent: Stopping DNS server 127.0.0.1:18689 (udp)
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.529286 [INFO] agent: Stopping HTTP server 127.0.0.1:18690 (tcp)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.530757 [INFO] agent: Requesting shutdown
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.530842 [INFO] consul: shutting down server
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.530886 [WARN] serf: Shutdown without a Leave
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.529018 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.534372 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.534509 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.534573 [ERR] consul: failed to transfer leadership in 3 attempts
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.651164 [WARN] serf: Shutdown without a Leave
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.692122 [INFO] agent: Waiting for endpoints to shut down
TestCatalogConnectServiceNodes_good - 2019/12/30 18:56:53.692595 [INFO] agent: Endpoints down
=== CONT  TestCatalogServiceNodes_Filter
--- PASS: TestCatalogConnectServiceNodes_good (6.77s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:53.770218 [WARN] agent: Node name "Node 325b8e5e-72f0-9eff-b6ae-b87fa82ab8e7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:53.771233 [DEBUG] tlsutil: Update with version 1
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:53.773641 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.800458 [INFO] manager: shutting down
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.802682 [INFO] agent: consul server down
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.802752 [INFO] agent: shutdown complete
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.802819 [INFO] agent: Stopping DNS server 127.0.0.1:18683 (tcp)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.803017 [INFO] agent: Stopping DNS server 127.0.0.1:18683 (udp)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.803279 [INFO] agent: Stopping HTTP server 127.0.0.1:18684 (tcp)
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.803553 [INFO] agent: Waiting for endpoints to shut down
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.803635 [INFO] agent: Endpoints down
--- PASS: TestCatalogConnectServiceNodes_Filter (7.68s)
=== CONT  TestCatalogServiceNodes_NodeMetaFilter
TestCatalogConnectServiceNodes_Filter - 2019/12/30 18:56:53.822742 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:54.085597 [WARN] agent: Node name "Node 4be40cce-ff23-7be5-0921-b49a6636e32b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:54.086033 [DEBUG] tlsutil: Update with version 1
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:54.089014 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:54 [INFO]  raft: Node at 127.0.0.1:18700 [Leader] entering Leader state
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:54.270493 [INFO] consul: cluster leadership acquired
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:54.270945 [INFO] consul: New leader elected: Node 01e5a8b9-abbf-383f-a6a9-b3a33d5168d5
2019/12/30 18:56:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9b42bd13-0b35-1566-2adb-46af2525a488 Address:127.0.0.1:18706}]
2019/12/30 18:56:54 [INFO]  raft: Node at 127.0.0.1:18706 [Follower] entering Follower state (Leader: "")
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.360151 [INFO] serf: EventMemberJoin: Node 9b42bd13-0b35-1566-2adb-46af2525a488.dc1 127.0.0.1
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.366330 [INFO] serf: EventMemberJoin: Node 9b42bd13-0b35-1566-2adb-46af2525a488 127.0.0.1
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.367733 [INFO] consul: Adding LAN server Node 9b42bd13-0b35-1566-2adb-46af2525a488 (Addr: tcp/127.0.0.1:18706) (DC: dc1)
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.368445 [INFO] consul: Handled member-join event for server "Node 9b42bd13-0b35-1566-2adb-46af2525a488.dc1" in area "wan"
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.370912 [INFO] agent: Started DNS server 127.0.0.1:18701 (tcp)
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.371503 [INFO] agent: Started DNS server 127.0.0.1:18701 (udp)
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.377151 [INFO] agent: Started HTTP server on 127.0.0.1:18702 (tcp)
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:54.377632 [INFO] agent: started state syncer
2019/12/30 18:56:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:54 [INFO]  raft: Node at 127.0.0.1:18706 [Candidate] entering Candidate state in term 2
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:54.669564 [INFO] agent: Synced node info
2019/12/30 18:56:55 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:55 [INFO]  raft: Node at 127.0.0.1:18706 [Leader] entering Leader state
2019/12/30 18:56:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:325b8e5e-72f0-9eff-b6ae-b87fa82ab8e7 Address:127.0.0.1:18712}]
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.095961 [INFO] agent: Requesting shutdown
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.096069 [INFO] consul: shutting down server
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.096131 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.097379 [INFO] serf: EventMemberJoin: Node 325b8e5e-72f0-9eff-b6ae-b87fa82ab8e7.dc1 127.0.0.1
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:55.098174 [INFO] consul: cluster leadership acquired
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:55.098600 [INFO] consul: New leader elected: Node 9b42bd13-0b35-1566-2adb-46af2525a488
2019/12/30 18:56:55 [INFO]  raft: Node at 127.0.0.1:18712 [Follower] entering Follower state (Leader: "")
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.101559 [INFO] serf: EventMemberJoin: Node 325b8e5e-72f0-9eff-b6ae-b87fa82ab8e7 127.0.0.1
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.102390 [INFO] consul: Adding LAN server Node 325b8e5e-72f0-9eff-b6ae-b87fa82ab8e7 (Addr: tcp/127.0.0.1:18712) (DC: dc1)
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.102679 [INFO] consul: Handled member-join event for server "Node 325b8e5e-72f0-9eff-b6ae-b87fa82ab8e7.dc1" in area "wan"
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.102787 [INFO] agent: Started DNS server 127.0.0.1:18707 (udp)
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.103119 [INFO] agent: Started DNS server 127.0.0.1:18707 (tcp)
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.105484 [INFO] agent: Started HTTP server on 127.0.0.1:18708 (tcp)
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.105575 [INFO] agent: started state syncer
2019/12/30 18:56:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:55 [INFO]  raft: Node at 127.0.0.1:18712 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4be40cce-ff23-7be5-0921-b49a6636e32b Address:127.0.0.1:18718}]
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.203555 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:55 [INFO]  raft: Node at 127.0.0.1:18718 [Follower] entering Follower state (Leader: "")
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.206911 [INFO] serf: EventMemberJoin: Node 4be40cce-ff23-7be5-0921-b49a6636e32b.dc1 127.0.0.1
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.210709 [INFO] serf: EventMemberJoin: Node 4be40cce-ff23-7be5-0921-b49a6636e32b 127.0.0.1
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.211598 [INFO] consul: Handled member-join event for server "Node 4be40cce-ff23-7be5-0921-b49a6636e32b.dc1" in area "wan"
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.211887 [INFO] consul: Adding LAN server Node 4be40cce-ff23-7be5-0921-b49a6636e32b (Addr: tcp/127.0.0.1:18718) (DC: dc1)
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.212402 [INFO] agent: Started DNS server 127.0.0.1:18713 (tcp)
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.212466 [INFO] agent: Started DNS server 127.0.0.1:18713 (udp)
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.215207 [INFO] agent: Started HTTP server on 127.0.0.1:18714 (tcp)
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:55.215304 [INFO] agent: started state syncer
2019/12/30 18:56:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:55 [INFO]  raft: Node at 127.0.0.1:18718 [Candidate] entering Candidate state in term 2
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.319309 [INFO] manager: shutting down
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.410285 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.410499 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.410566 [INFO] agent: consul server down
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.410616 [INFO] agent: shutdown complete
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.410689 [INFO] agent: Stopping DNS server 127.0.0.1:18695 (tcp)
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.410568 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.410842 [INFO] agent: Stopping DNS server 127.0.0.1:18695 (udp)
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.411008 [INFO] agent: Stopping HTTP server 127.0.0.1:18696 (tcp)
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.411220 [INFO] agent: Waiting for endpoints to shut down
TestCatalogServiceNodes_ConnectProxy - 2019/12/30 18:56:55.411293 [INFO] agent: Endpoints down
--- PASS: TestCatalogServiceNodes_ConnectProxy (3.93s)
=== CONT  TestCatalogServiceNodes
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogServiceNodes - 2019/12/30 18:56:55.515070 [WARN] agent: Node name "Node 2053a144-8fd8-2ab6-2242-486dc989d7a5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogServiceNodes - 2019/12/30 18:56:55.515470 [DEBUG] tlsutil: Update with version 1
TestCatalogServiceNodes - 2019/12/30 18:56:55.527973 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:56:55.819063 [DEBUG] consul: Skipping self join check for "Node a8b3e297-b53a-bcd0-efda-5addcd938805" since the cluster is too small
2019/12/30 18:56:55 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:55 [INFO]  raft: Node at 127.0.0.1:18712 [Leader] entering Leader state
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.919358 [INFO] consul: cluster leadership acquired
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:55.919823 [INFO] consul: New leader elected: Node 325b8e5e-72f0-9eff-b6ae-b87fa82ab8e7
2019/12/30 18:56:56 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:56 [INFO]  raft: Node at 127.0.0.1:18718 [Leader] entering Leader state
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.022134 [INFO] consul: cluster leadership acquired
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:56.022549 [INFO] agent: Synced node info
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:56.022650 [DEBUG] agent: Node info in sync
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.022565 [INFO] consul: New leader elected: Node 4be40cce-ff23-7be5-0921-b49a6636e32b
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:56.522234 [INFO] agent: Synced node info
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:56.522347 [DEBUG] agent: Node info in sync
2019/12/30 18:56:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2053a144-8fd8-2ab6-2242-486dc989d7a5 Address:127.0.0.1:18724}]
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.604534 [INFO] agent: Requesting shutdown
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.604620 [INFO] consul: shutting down server
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.604664 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.606060 [INFO] agent: Synced node info
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.606180 [DEBUG] agent: Node info in sync
2019/12/30 18:56:56 [INFO]  raft: Node at 127.0.0.1:18724 [Follower] entering Follower state (Leader: "")
TestCatalogServiceNodes - 2019/12/30 18:56:56.609164 [INFO] serf: EventMemberJoin: Node 2053a144-8fd8-2ab6-2242-486dc989d7a5.dc1 127.0.0.1
TestCatalogServiceNodes - 2019/12/30 18:56:56.612836 [INFO] serf: EventMemberJoin: Node 2053a144-8fd8-2ab6-2242-486dc989d7a5 127.0.0.1
TestCatalogServiceNodes - 2019/12/30 18:56:56.614887 [INFO] consul: Adding LAN server Node 2053a144-8fd8-2ab6-2242-486dc989d7a5 (Addr: tcp/127.0.0.1:18724) (DC: dc1)
TestCatalogServiceNodes - 2019/12/30 18:56:56.615275 [INFO] consul: Handled member-join event for server "Node 2053a144-8fd8-2ab6-2242-486dc989d7a5.dc1" in area "wan"
TestCatalogServiceNodes - 2019/12/30 18:56:56.616355 [INFO] agent: Started DNS server 127.0.0.1:18719 (tcp)
TestCatalogServiceNodes - 2019/12/30 18:56:56.616721 [INFO] agent: Started DNS server 127.0.0.1:18719 (udp)
TestCatalogServiceNodes - 2019/12/30 18:56:56.619166 [INFO] agent: Started HTTP server on 127.0.0.1:18720 (tcp)
TestCatalogServiceNodes - 2019/12/30 18:56:56.619275 [INFO] agent: started state syncer
2019/12/30 18:56:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:56 [INFO]  raft: Node at 127.0.0.1:18724 [Candidate] entering Candidate state in term 2
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.711925 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.885351 [INFO] manager: shutting down
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.983895 [INFO] agent: consul server down
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.983971 [INFO] agent: shutdown complete
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.984026 [INFO] agent: Stopping DNS server 127.0.0.1:18713 (tcp)
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.984190 [INFO] agent: Stopping DNS server 127.0.0.1:18713 (udp)
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.984363 [INFO] agent: Stopping HTTP server 127.0.0.1:18714 (tcp)
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.984683 [INFO] agent: Waiting for endpoints to shut down
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.984758 [INFO] agent: Endpoints down
--- PASS: TestCatalogServiceNodes_NodeMetaFilter (3.18s)
=== CONT  TestCatalogServices_NodeMetaFilter
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.990272 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.990631 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCatalogServiceNodes_NodeMetaFilter - 2019/12/30 18:56:56.990859 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:57.074083 [WARN] agent: Node name "Node 48a2dc88-e798-c179-d6ca-726ce52d659b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:57.075510 [DEBUG] tlsutil: Update with version 1
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:57.078421 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.088355 [INFO] agent: Requesting shutdown
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.088470 [INFO] consul: shutting down server
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.088519 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.214357 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.292665 [INFO] agent: Requesting shutdown
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.292772 [INFO] consul: shutting down server
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.292818 [WARN] serf: Shutdown without a Leave
2019/12/30 18:56:57 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:57 [INFO]  raft: Node at 127.0.0.1:18724 [Leader] entering Leader state
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.414720 [INFO] manager: shutting down
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.414790 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes - 2019/12/30 18:56:57.414890 [INFO] consul: cluster leadership acquired
TestCatalogServiceNodes - 2019/12/30 18:56:57.415259 [INFO] consul: New leader elected: Node 2053a144-8fd8-2ab6-2242-486dc989d7a5
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.537055 [INFO] manager: shutting down
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.660604 [INFO] agent: consul server down
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.660678 [INFO] agent: shutdown complete
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.660742 [INFO] agent: Stopping DNS server 127.0.0.1:18701 (tcp)
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.660893 [INFO] agent: Stopping DNS server 127.0.0.1:18701 (udp)
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.661073 [WARN] consul.coordinate: Batch update failed: leadership lost while committing log
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.661081 [INFO] agent: Stopping HTTP server 127.0.0.1:18702 (tcp)
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.661163 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.661396 [INFO] agent: Waiting for endpoints to shut down
TestCatalogServiceNodes_DistanceSort - 2019/12/30 18:56:57.661468 [INFO] agent: Endpoints down
--- FAIL: TestCatalogServiceNodes_DistanceSort (4.68s)
    catalog_endpoint_test.go:922: bad: [0x48dfb00 0x48dfbc0]
=== CONT  TestCatalogServices
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.662275 [INFO] agent: consul server down
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.662326 [INFO] agent: shutdown complete
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.662380 [INFO] agent: Stopping DNS server 127.0.0.1:18707 (tcp)
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.662535 [INFO] agent: Stopping DNS server 127.0.0.1:18707 (udp)
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.662701 [INFO] agent: Stopping HTTP server 127.0.0.1:18708 (tcp)
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.662889 [INFO] agent: Waiting for endpoints to shut down
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.662953 [INFO] agent: Endpoints down
--- PASS: TestCatalogServiceNodes_Filter (3.97s)
=== CONT  TestCatalogNodes_DistanceSort
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.664018 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestCatalogServiceNodes_Filter - 2019/12/30 18:56:57.664185 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:57.730683 [WARN] agent: Node name "Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:57.731245 [DEBUG] tlsutil: Update with version 1
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:57.733819 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogServices - 2019/12/30 18:56:57.741362 [WARN] agent: Node name "Node 238721a1-b862-86dd-2033-c31eef0c4d02" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogServices - 2019/12/30 18:56:57.741741 [DEBUG] tlsutil: Update with version 1
TestCatalogServices - 2019/12/30 18:56:57.743816 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogServiceNodes - 2019/12/30 18:56:57.928786 [INFO] agent: Synced node info
TestCatalogServiceNodes - 2019/12/30 18:56:57.928899 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:56:58.127491 [DEBUG] consul: Skipping self join check for "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d" since the cluster is too small
2019/12/30 18:56:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:48a2dc88-e798-c179-d6ca-726ce52d659b Address:127.0.0.1:18730}]
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:18730 [Follower] entering Follower state (Leader: "")
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.350311 [INFO] serf: EventMemberJoin: Node 48a2dc88-e798-c179-d6ca-726ce52d659b.dc1 127.0.0.1
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.353525 [INFO] serf: EventMemberJoin: Node 48a2dc88-e798-c179-d6ca-726ce52d659b 127.0.0.1
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.357126 [INFO] consul: Handled member-join event for server "Node 48a2dc88-e798-c179-d6ca-726ce52d659b.dc1" in area "wan"
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.357279 [INFO] consul: Adding LAN server Node 48a2dc88-e798-c179-d6ca-726ce52d659b (Addr: tcp/127.0.0.1:18730) (DC: dc1)
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.357943 [INFO] agent: Started DNS server 127.0.0.1:18725 (tcp)
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.360842 [INFO] agent: Started DNS server 127.0.0.1:18725 (udp)
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.363501 [INFO] agent: Started HTTP server on 127.0.0.1:18726 (tcp)
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:58.363607 [INFO] agent: started state syncer
2019/12/30 18:56:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:18730 [Candidate] entering Candidate state in term 2
TestCatalogServiceNodes - 2019/12/30 18:56:58.488524 [INFO] agent: Requesting shutdown
TestCatalogServiceNodes - 2019/12/30 18:56:58.488616 [INFO] consul: shutting down server
TestCatalogServiceNodes - 2019/12/30 18:56:58.488666 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes - 2019/12/30 18:56:58.643767 [WARN] serf: Shutdown without a Leave
TestCatalogServiceNodes - 2019/12/30 18:56:58.713114 [INFO] manager: shutting down
TestCatalogServiceNodes - 2019/12/30 18:56:58.910676 [INFO] agent: consul server down
TestCatalogServiceNodes - 2019/12/30 18:56:58.910752 [INFO] agent: shutdown complete
TestCatalogServiceNodes - 2019/12/30 18:56:58.910812 [INFO] agent: Stopping DNS server 127.0.0.1:18719 (tcp)
2019/12/30 18:56:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a9dc6397-20c0-bbaa-2b73-d1e28107fd19 Address:127.0.0.1:18742}]
TestCatalogServiceNodes - 2019/12/30 18:56:58.910956 [INFO] agent: Stopping DNS server 127.0.0.1:18719 (udp)
TestCatalogServiceNodes - 2019/12/30 18:56:58.911156 [INFO] agent: Stopping HTTP server 127.0.0.1:18720 (tcp)
TestCatalogServiceNodes - 2019/12/30 18:56:58.911329 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
2019/12/30 18:56:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:238721a1-b862-86dd-2033-c31eef0c4d02 Address:127.0.0.1:18736}]
TestCatalogServiceNodes - 2019/12/30 18:56:58.911377 [INFO] agent: Waiting for endpoints to shut down
TestCatalogServiceNodes - 2019/12/30 18:56:58.911712 [INFO] agent: Endpoints down
--- PASS: TestCatalogServiceNodes (3.50s)
=== CONT  TestCatalogNodes_Blocking
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.914077 [INFO] serf: EventMemberJoin: Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19.dc1 127.0.0.1
TestCatalogServices - 2019/12/30 18:56:58.915366 [INFO] serf: EventMemberJoin: Node 238721a1-b862-86dd-2033-c31eef0c4d02.dc1 127.0.0.1
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.919065 [INFO] serf: EventMemberJoin: Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19 127.0.0.1
TestCatalogServiceNodes - 2019/12/30 18:56:58.919570 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCatalogServiceNodes - 2019/12/30 18:56:58.919643 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:18742 [Follower] entering Follower state (Leader: "")
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:18736 [Follower] entering Follower state (Leader: "")
TestCatalogServices - 2019/12/30 18:56:58.922895 [INFO] serf: EventMemberJoin: Node 238721a1-b862-86dd-2033-c31eef0c4d02 127.0.0.1
TestCatalogServices - 2019/12/30 18:56:58.924051 [INFO] agent: Started DNS server 127.0.0.1:18731 (udp)
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.925440 [INFO] agent: Started DNS server 127.0.0.1:18737 (udp)
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.925846 [INFO] consul: Adding LAN server Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19 (Addr: tcp/127.0.0.1:18742) (DC: dc1)
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.926034 [INFO] consul: Handled member-join event for server "Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19.dc1" in area "wan"
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.926450 [INFO] agent: Started DNS server 127.0.0.1:18737 (tcp)
TestCatalogServices - 2019/12/30 18:56:58.928141 [INFO] consul: Adding LAN server Node 238721a1-b862-86dd-2033-c31eef0c4d02 (Addr: tcp/127.0.0.1:18736) (DC: dc1)
TestCatalogServices - 2019/12/30 18:56:58.928361 [INFO] consul: Handled member-join event for server "Node 238721a1-b862-86dd-2033-c31eef0c4d02.dc1" in area "wan"
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.928737 [INFO] agent: Started HTTP server on 127.0.0.1:18738 (tcp)
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:58.928817 [INFO] agent: started state syncer
TestCatalogServices - 2019/12/30 18:56:58.928859 [INFO] agent: Started DNS server 127.0.0.1:18731 (tcp)
TestCatalogServices - 2019/12/30 18:56:58.931619 [INFO] agent: Started HTTP server on 127.0.0.1:18732 (tcp)
TestCatalogServices - 2019/12/30 18:56:58.931914 [INFO] agent: started state syncer
2019/12/30 18:56:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:18736 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:18742 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.018776 [WARN] agent: Node name "Node f3569cae-7eaf-a415-771f-b2203a9b1226" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.019166 [DEBUG] tlsutil: Update with version 1
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.021286 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:59 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:59 [INFO]  raft: Node at 127.0.0.1:18730 [Leader] entering Leader state
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:59.121475 [INFO] consul: cluster leadership acquired
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:59.121973 [INFO] consul: New leader elected: Node 48a2dc88-e798-c179-d6ca-726ce52d659b
2019/12/30 18:56:59 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:59 [INFO]  raft: Node at 127.0.0.1:18742 [Leader] entering Leader state
2019/12/30 18:56:59 [INFO]  raft: Election won. Tally: 1
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:59.520607 [INFO] consul: cluster leadership acquired
2019/12/30 18:56:59 [INFO]  raft: Node at 127.0.0.1:18736 [Leader] entering Leader state
TestCatalogNodes_DistanceSort - 2019/12/30 18:56:59.521006 [INFO] consul: New leader elected: Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19
TestCatalogServices - 2019/12/30 18:56:59.521227 [INFO] consul: cluster leadership acquired
TestCatalogServices - 2019/12/30 18:56:59.521620 [INFO] consul: New leader elected: Node 238721a1-b862-86dd-2033-c31eef0c4d02
2019/12/30 18:56:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f3569cae-7eaf-a415-771f-b2203a9b1226 Address:127.0.0.1:18748}]
2019/12/30 18:56:59 [INFO]  raft: Node at 127.0.0.1:18748 [Follower] entering Follower state (Leader: "")
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.934084 [INFO] serf: EventMemberJoin: Node f3569cae-7eaf-a415-771f-b2203a9b1226.dc1 127.0.0.1
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:59.934318 [INFO] agent: Requesting shutdown
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:59.934437 [INFO] consul: shutting down server
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:59.934494 [WARN] serf: Shutdown without a Leave
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:59.934792 [INFO] agent: Synced node info
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:56:59.934901 [DEBUG] agent: Node info in sync
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.937510 [INFO] serf: EventMemberJoin: Node f3569cae-7eaf-a415-771f-b2203a9b1226 127.0.0.1
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.938043 [INFO] consul: Handled member-join event for server "Node f3569cae-7eaf-a415-771f-b2203a9b1226.dc1" in area "wan"
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.938339 [INFO] consul: Adding LAN server Node f3569cae-7eaf-a415-771f-b2203a9b1226 (Addr: tcp/127.0.0.1:18748) (DC: dc1)
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.938642 [INFO] agent: Started DNS server 127.0.0.1:18743 (tcp)
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.938710 [INFO] agent: Started DNS server 127.0.0.1:18743 (udp)
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.941233 [INFO] agent: Started HTTP server on 127.0.0.1:18744 (tcp)
TestCatalogNodes_Blocking - 2019/12/30 18:56:59.941339 [INFO] agent: started state syncer
2019/12/30 18:56:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:59 [INFO]  raft: Node at 127.0.0.1:18748 [Candidate] entering Candidate state in term 2
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.070981 [WARN] serf: Shutdown without a Leave
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.168713 [INFO] manager: shutting down
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.169178 [INFO] agent: consul server down
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.169242 [INFO] agent: shutdown complete
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.169303 [INFO] agent: Stopping DNS server 127.0.0.1:18725 (tcp)
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.169565 [INFO] agent: Stopping DNS server 127.0.0.1:18725 (udp)
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.169251 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.169762 [INFO] agent: Stopping HTTP server 127.0.0.1:18726 (tcp)
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.169992 [INFO] agent: Waiting for endpoints to shut down
TestCatalogServices_NodeMetaFilter - 2019/12/30 18:57:00.170069 [INFO] agent: Endpoints down
--- PASS: TestCatalogServices_NodeMetaFilter (3.19s)
=== CONT  TestCatalogNodes_Filter
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:00.174004 [INFO] agent: Synced node info
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:00.174148 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodes_Filter - 2019/12/30 18:57:00.236782 [WARN] agent: Node name "Node 346883d5-3caf-4c93-6cd0-9c48970e01ff" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodes_Filter - 2019/12/30 18:57:00.237211 [DEBUG] tlsutil: Update with version 1
TestCatalogNodes_Filter - 2019/12/30 18:57:00.239821 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogServices - 2019/12/30 18:57:00.390104 [INFO] agent: Synced node info
TestCatalogServices - 2019/12/30 18:57:00.390244 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:57:00.391760 [DEBUG] consul: Skipping self join check for "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade" since the cluster is too small
2019/12/30 18:57:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:00 [INFO]  raft: Node at 127.0.0.1:18748 [Leader] entering Leader state
TestCatalogNodes_Blocking - 2019/12/30 18:57:00.655877 [INFO] consul: cluster leadership acquired
TestCatalogNodes_Blocking - 2019/12/30 18:57:00.656327 [INFO] consul: New leader elected: Node f3569cae-7eaf-a415-771f-b2203a9b1226
TestCatalogNodes_Blocking - 2019/12/30 18:57:01.119944 [INFO] agent: Synced node info
TestCatalogNodes_Blocking - 2019/12/30 18:57:01.120138 [DEBUG] agent: Node info in sync
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:01.394861 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:01.395435 [DEBUG] consul: Skipping self join check for "Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19" since the cluster is too small
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:01.395697 [INFO] consul: member 'Node a9dc6397-20c0-bbaa-2b73-d1e28107fd19' joined, marking health alive
2019/12/30 18:57:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:346883d5-3caf-4c93-6cd0-9c48970e01ff Address:127.0.0.1:18754}]
2019/12/30 18:57:01 [INFO]  raft: Node at 127.0.0.1:18754 [Follower] entering Follower state (Leader: "")
TestCatalogServices - 2019/12/30 18:57:01.398871 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogServices - 2019/12/30 18:57:01.399251 [DEBUG] consul: Skipping self join check for "Node 238721a1-b862-86dd-2033-c31eef0c4d02" since the cluster is too small
TestCatalogServices - 2019/12/30 18:57:01.399468 [INFO] consul: member 'Node 238721a1-b862-86dd-2033-c31eef0c4d02' joined, marking health alive
TestCatalogNodes_Filter - 2019/12/30 18:57:01.407923 [INFO] serf: EventMemberJoin: Node 346883d5-3caf-4c93-6cd0-9c48970e01ff.dc1 127.0.0.1
TestCatalogNodes_Filter - 2019/12/30 18:57:01.412536 [INFO] serf: EventMemberJoin: Node 346883d5-3caf-4c93-6cd0-9c48970e01ff 127.0.0.1
TestCatalogNodes_Filter - 2019/12/30 18:57:01.413825 [INFO] consul: Adding LAN server Node 346883d5-3caf-4c93-6cd0-9c48970e01ff (Addr: tcp/127.0.0.1:18754) (DC: dc1)
TestCatalogNodes_Filter - 2019/12/30 18:57:01.414188 [INFO] consul: Handled member-join event for server "Node 346883d5-3caf-4c93-6cd0-9c48970e01ff.dc1" in area "wan"
TestCatalogNodes_Filter - 2019/12/30 18:57:01.414881 [INFO] agent: Started DNS server 127.0.0.1:18749 (tcp)
TestCatalogNodes_Filter - 2019/12/30 18:57:01.415392 [INFO] agent: Started DNS server 127.0.0.1:18749 (udp)
TestCatalogNodes_Filter - 2019/12/30 18:57:01.418771 [INFO] agent: Started HTTP server on 127.0.0.1:18750 (tcp)
TestCatalogNodes_Filter - 2019/12/30 18:57:01.418863 [INFO] agent: started state syncer
2019/12/30 18:57:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:01 [INFO]  raft: Node at 127.0.0.1:18754 [Candidate] entering Candidate state in term 2
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:01.564003 [DEBUG] agent: Node info in sync
TestCatalogServices - 2019/12/30 18:57:01.834213 [INFO] agent: Requesting shutdown
TestCatalogServices - 2019/12/30 18:57:01.834352 [INFO] consul: shutting down server
TestCatalogServices - 2019/12/30 18:57:01.834502 [WARN] serf: Shutdown without a Leave
TestCatalogServices - 2019/12/30 18:57:01.911974 [WARN] serf: Shutdown without a Leave
TestCatalogServices - 2019/12/30 18:57:01.941488 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestCatalogServices - 2019/12/30 18:57:01.941565 [DEBUG] agent: Node info in sync
TestCatalogServices - 2019/12/30 18:57:02.003078 [INFO] manager: shutting down
TestCatalogServices - 2019/12/30 18:57:02.004731 [INFO] agent: consul server down
TestCatalogServices - 2019/12/30 18:57:02.004802 [INFO] agent: shutdown complete
TestCatalogServices - 2019/12/30 18:57:02.004873 [INFO] agent: Stopping DNS server 127.0.0.1:18731 (tcp)
TestCatalogServices - 2019/12/30 18:57:02.005048 [INFO] agent: Stopping DNS server 127.0.0.1:18731 (udp)
TestCatalogServices - 2019/12/30 18:57:02.005249 [INFO] agent: Stopping HTTP server 127.0.0.1:18732 (tcp)
TestCatalogServices - 2019/12/30 18:57:02.005482 [INFO] agent: Waiting for endpoints to shut down
TestCatalogServices - 2019/12/30 18:57:02.005564 [INFO] agent: Endpoints down
--- PASS: TestCatalogServices (4.34s)
=== CONT  TestCatalogNodes_MetaFilter
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:02.072429 [WARN] agent: Node name "Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:02.072902 [DEBUG] tlsutil: Update with version 1
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:02.075492 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:02 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:02 [INFO]  raft: Node at 127.0.0.1:18754 [Leader] entering Leader state
TestCatalogNodes_Filter - 2019/12/30 18:57:02.138475 [INFO] consul: cluster leadership acquired
TestCatalogNodes_Filter - 2019/12/30 18:57:02.138876 [INFO] consul: New leader elected: Node 346883d5-3caf-4c93-6cd0-9c48970e01ff
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.305191 [INFO] agent: Requesting shutdown
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.305279 [INFO] consul: shutting down server
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.305388 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.372902 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_Blocking - 2019/12/30 18:57:02.373795 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodes_Blocking - 2019/12/30 18:57:02.374198 [DEBUG] consul: Skipping self join check for "Node f3569cae-7eaf-a415-771f-b2203a9b1226" since the cluster is too small
TestCatalogNodes_Blocking - 2019/12/30 18:57:02.374354 [INFO] consul: member 'Node f3569cae-7eaf-a415-771f-b2203a9b1226' joined, marking health alive
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.453724 [INFO] manager: shutting down
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.454358 [INFO] agent: consul server down
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.454484 [INFO] agent: shutdown complete
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.454556 [INFO] agent: Stopping DNS server 127.0.0.1:18737 (tcp)
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.454740 [INFO] agent: Stopping DNS server 127.0.0.1:18737 (udp)
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.454938 [INFO] agent: Stopping HTTP server 127.0.0.1:18738 (tcp)
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.455183 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodes_DistanceSort - 2019/12/30 18:57:02.455268 [INFO] agent: Endpoints down
--- FAIL: TestCatalogNodes_DistanceSort (4.79s)
    catalog_endpoint_test.go:437: bad: [0x5b0f440 0x56e5600 0x58b8840]
=== CONT  TestCatalogNodes
jones - 2019/12/30 18:57:02.464869 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:02.464949 [DEBUG] agent: Node info in sync
TestCatalogNodes_Filter - 2019/12/30 18:57:02.466560 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogNodes - 2019/12/30 18:57:02.515760 [WARN] agent: Node name "Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodes - 2019/12/30 18:57:02.516706 [DEBUG] tlsutil: Update with version 1
TestCatalogNodes - 2019/12/30 18:57:02.518809 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodes_Filter - 2019/12/30 18:57:02.560746 [DEBUG] agent: Node info in sync
TestCatalogNodes_Filter - 2019/12/30 18:57:02.560857 [DEBUG] agent: Node info in sync
TestCatalogNodes_Blocking - 2019/12/30 18:57:02.964966 [INFO] agent: Requesting shutdown
TestCatalogNodes_Blocking - 2019/12/30 18:57:02.965057 [INFO] consul: shutting down server
TestCatalogNodes_Blocking - 2019/12/30 18:57:02.965166 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.054331 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.239536 [INFO] manager: shutting down
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.240002 [INFO] agent: consul server down
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.240059 [INFO] agent: shutdown complete
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.240118 [INFO] agent: Stopping DNS server 127.0.0.1:18743 (tcp)
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.240259 [INFO] agent: Stopping DNS server 127.0.0.1:18743 (udp)
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.240411 [INFO] agent: Stopping HTTP server 127.0.0.1:18744 (tcp)
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.240603 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodes_Blocking - 2019/12/30 18:57:03.240675 [INFO] agent: Endpoints down
--- PASS: TestCatalogNodes_Blocking (4.33s)
=== CONT  TestCatalogDatacenters
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogDatacenters - 2019/12/30 18:57:03.317299 [WARN] agent: Node name "Node d0ae64e5-33b6-393d-c71f-b42fc8852ac2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogDatacenters - 2019/12/30 18:57:03.317941 [DEBUG] tlsutil: Update with version 1
TestCatalogDatacenters - 2019/12/30 18:57:03.322107 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:57:03.334751 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:03.334830 [DEBUG] agent: Node info in sync
2019/12/30 18:57:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:349b7b00-0c1e-2d8b-65f3-c258a5daa723 Address:127.0.0.1:18760}]
2019/12/30 18:57:03 [INFO]  raft: Node at 127.0.0.1:18760 [Follower] entering Follower state (Leader: "")
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.421570 [INFO] serf: EventMemberJoin: Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723.dc1 127.0.0.1
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.424961 [INFO] serf: EventMemberJoin: Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723 127.0.0.1
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.425547 [INFO] consul: Handled member-join event for server "Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723.dc1" in area "wan"
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.425927 [INFO] consul: Adding LAN server Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723 (Addr: tcp/127.0.0.1:18760) (DC: dc1)
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.426150 [INFO] agent: Started DNS server 127.0.0.1:18755 (tcp)
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.426231 [INFO] agent: Started DNS server 127.0.0.1:18755 (udp)
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.428703 [INFO] agent: Started HTTP server on 127.0.0.1:18756 (tcp)
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:03.428804 [INFO] agent: started state syncer
2019/12/30 18:57:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:03 [INFO]  raft: Node at 127.0.0.1:18760 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d4b43436-7132-fa9f-56e4-0e3c3ae72bfd Address:127.0.0.1:18766}]
2019/12/30 18:57:03 [INFO]  raft: Node at 127.0.0.1:18766 [Follower] entering Follower state (Leader: "")
TestCatalogNodes - 2019/12/30 18:57:03.815984 [INFO] serf: EventMemberJoin: Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd.dc1 127.0.0.1
TestCatalogNodes - 2019/12/30 18:57:03.822016 [INFO] serf: EventMemberJoin: Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd 127.0.0.1
TestCatalogNodes - 2019/12/30 18:57:03.823328 [INFO] agent: Started DNS server 127.0.0.1:18761 (udp)
TestCatalogNodes - 2019/12/30 18:57:03.824005 [INFO] consul: Adding LAN server Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd (Addr: tcp/127.0.0.1:18766) (DC: dc1)
TestCatalogNodes - 2019/12/30 18:57:03.824241 [INFO] consul: Handled member-join event for server "Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd.dc1" in area "wan"
TestCatalogNodes - 2019/12/30 18:57:03.824864 [INFO] agent: Started DNS server 127.0.0.1:18761 (tcp)
TestCatalogNodes - 2019/12/30 18:57:03.827361 [INFO] agent: Started HTTP server on 127.0.0.1:18762 (tcp)
TestCatalogNodes - 2019/12/30 18:57:03.827465 [INFO] agent: started state syncer
2019/12/30 18:57:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:03 [INFO]  raft: Node at 127.0.0.1:18766 [Candidate] entering Candidate state in term 2
TestCatalogNodes_Filter - 2019/12/30 18:57:03.930295 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodes_Filter - 2019/12/30 18:57:03.930799 [DEBUG] consul: Skipping self join check for "Node 346883d5-3caf-4c93-6cd0-9c48970e01ff" since the cluster is too small
TestCatalogNodes_Filter - 2019/12/30 18:57:03.930967 [INFO] consul: member 'Node 346883d5-3caf-4c93-6cd0-9c48970e01ff' joined, marking health alive
2019/12/30 18:57:04 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:04 [INFO]  raft: Node at 127.0.0.1:18760 [Leader] entering Leader state
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:04.122944 [INFO] consul: cluster leadership acquired
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:04.123428 [INFO] consul: New leader elected: Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723
TestCatalogNodes_Filter - 2019/12/30 18:57:04.364741 [INFO] agent: Requesting shutdown
TestCatalogNodes_Filter - 2019/12/30 18:57:04.364861 [INFO] consul: shutting down server
TestCatalogNodes_Filter - 2019/12/30 18:57:04.364918 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_Filter - 2019/12/30 18:57:04.444762 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_Filter - 2019/12/30 18:57:04.521633 [INFO] manager: shutting down
TestCatalogNodes_Filter - 2019/12/30 18:57:04.522231 [INFO] agent: consul server down
TestCatalogNodes_Filter - 2019/12/30 18:57:04.522295 [INFO] agent: shutdown complete
TestCatalogNodes_Filter - 2019/12/30 18:57:04.522351 [INFO] agent: Stopping DNS server 127.0.0.1:18749 (tcp)
TestCatalogNodes_Filter - 2019/12/30 18:57:04.522510 [INFO] agent: Stopping DNS server 127.0.0.1:18749 (udp)
TestCatalogNodes_Filter - 2019/12/30 18:57:04.522712 [INFO] agent: Stopping HTTP server 127.0.0.1:18750 (tcp)
TestCatalogNodes_Filter - 2019/12/30 18:57:04.522969 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodes_Filter - 2019/12/30 18:57:04.523050 [INFO] agent: Endpoints down
--- PASS: TestCatalogNodes_Filter (4.35s)
=== CONT  TestCatalogDeregister
2019/12/30 18:57:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d0ae64e5-33b6-393d-c71f-b42fc8852ac2 Address:127.0.0.1:18772}]
2019/12/30 18:57:04 [INFO]  raft: Node at 127.0.0.1:18772 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:04 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:04 [INFO]  raft: Node at 127.0.0.1:18766 [Leader] entering Leader state
TestCatalogDatacenters - 2019/12/30 18:57:04.529009 [INFO] serf: EventMemberJoin: Node d0ae64e5-33b6-393d-c71f-b42fc8852ac2.dc1 127.0.0.1
TestCatalogNodes - 2019/12/30 18:57:04.537599 [INFO] consul: cluster leadership acquired
TestCatalogNodes - 2019/12/30 18:57:04.538058 [INFO] consul: New leader elected: Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd
2019/12/30 18:57:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:04 [INFO]  raft: Node at 127.0.0.1:18772 [Candidate] entering Candidate state in term 2
TestCatalogDatacenters - 2019/12/30 18:57:04.587834 [INFO] serf: EventMemberJoin: Node d0ae64e5-33b6-393d-c71f-b42fc8852ac2 127.0.0.1
TestCatalogDatacenters - 2019/12/30 18:57:04.589691 [INFO] consul: Adding LAN server Node d0ae64e5-33b6-393d-c71f-b42fc8852ac2 (Addr: tcp/127.0.0.1:18772) (DC: dc1)
TestCatalogDatacenters - 2019/12/30 18:57:04.590429 [INFO] consul: Handled member-join event for server "Node d0ae64e5-33b6-393d-c71f-b42fc8852ac2.dc1" in area "wan"
TestCatalogDatacenters - 2019/12/30 18:57:04.591798 [INFO] agent: Started DNS server 127.0.0.1:18767 (tcp)
TestCatalogDatacenters - 2019/12/30 18:57:04.592350 [INFO] agent: Started DNS server 127.0.0.1:18767 (udp)
TestCatalogDatacenters - 2019/12/30 18:57:04.597702 [INFO] agent: Started HTTP server on 127.0.0.1:18768 (tcp)
TestCatalogDatacenters - 2019/12/30 18:57:04.597829 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogDeregister - 2019/12/30 18:57:04.630630 [WARN] agent: Node name "Node 8a1e3daf-5832-ebe3-5826-c69c9eec4002" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogDeregister - 2019/12/30 18:57:04.631166 [DEBUG] tlsutil: Update with version 1
TestCatalogDeregister - 2019/12/30 18:57:04.633405 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:04.761691 [INFO] agent: Synced node info
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:04.761819 [DEBUG] agent: Node info in sync
TestCatalogNodes - 2019/12/30 18:57:05.012473 [INFO] agent: Synced node info
TestCatalogNodes - 2019/12/30 18:57:05.012611 [DEBUG] agent: Node info in sync
TestCatalogNodes - 2019/12/30 18:57:05.218833 [DEBUG] agent: Node info in sync
2019/12/30 18:57:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:05 [INFO]  raft: Node at 127.0.0.1:18772 [Leader] entering Leader state
TestCatalogDatacenters - 2019/12/30 18:57:05.475361 [INFO] consul: cluster leadership acquired
TestCatalogDatacenters - 2019/12/30 18:57:05.475833 [INFO] consul: New leader elected: Node d0ae64e5-33b6-393d-c71f-b42fc8852ac2
TestCatalogDatacenters - 2019/12/30 18:57:05.485537 [INFO] agent: Requesting shutdown
TestCatalogDatacenters - 2019/12/30 18:57:05.485641 [INFO] consul: shutting down server
TestCatalogDatacenters - 2019/12/30 18:57:05.485691 [WARN] serf: Shutdown without a Leave
TestCatalogDatacenters - 2019/12/30 18:57:05.696626 [WARN] serf: Shutdown without a Leave
TestCatalogDatacenters - 2019/12/30 18:57:05.796639 [INFO] manager: shutting down
jones - 2019/12/30 18:57:05.964061 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:05.964152 [DEBUG] agent: Node info in sync
TestCatalogDatacenters - 2019/12/30 18:57:05.994161 [INFO] agent: consul server down
TestCatalogDatacenters - 2019/12/30 18:57:05.994248 [INFO] agent: shutdown complete
TestCatalogDatacenters - 2019/12/30 18:57:05.994324 [INFO] agent: Stopping DNS server 127.0.0.1:18767 (tcp)
TestCatalogDatacenters - 2019/12/30 18:57:05.994615 [INFO] agent: Stopping DNS server 127.0.0.1:18767 (udp)
TestCatalogDatacenters - 2019/12/30 18:57:05.994805 [INFO] agent: Stopping HTTP server 127.0.0.1:18768 (tcp)
TestCatalogDatacenters - 2019/12/30 18:57:05.995042 [INFO] agent: Waiting for endpoints to shut down
TestCatalogDatacenters - 2019/12/30 18:57:05.995122 [INFO] agent: Endpoints down
2019/12/30 18:57:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8a1e3daf-5832-ebe3-5826-c69c9eec4002 Address:127.0.0.1:18778}]
--- PASS: TestCatalogDatacenters (2.75s)
=== CONT  TestCatalogRegister_Service_InvalidAddress
TestCatalogDeregister - 2019/12/30 18:57:05.998300 [INFO] serf: EventMemberJoin: Node 8a1e3daf-5832-ebe3-5826-c69c9eec4002.dc1 127.0.0.1
TestCatalogDatacenters - 2019/12/30 18:57:06.000621 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCatalogDatacenters - 2019/12/30 18:57:06.000793 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestCatalogDatacenters - 2019/12/30 18:57:06.000864 [ERR] agent: failed to sync remote state: leadership lost while committing log
2019/12/30 18:57:06 [INFO]  raft: Node at 127.0.0.1:18778 [Follower] entering Follower state (Leader: "")
TestCatalogDeregister - 2019/12/30 18:57:06.004079 [INFO] serf: EventMemberJoin: Node 8a1e3daf-5832-ebe3-5826-c69c9eec4002 127.0.0.1
TestCatalogDeregister - 2019/12/30 18:57:06.010163 [INFO] consul: Adding LAN server Node 8a1e3daf-5832-ebe3-5826-c69c9eec4002 (Addr: tcp/127.0.0.1:18778) (DC: dc1)
TestCatalogDeregister - 2019/12/30 18:57:06.010461 [INFO] consul: Handled member-join event for server "Node 8a1e3daf-5832-ebe3-5826-c69c9eec4002.dc1" in area "wan"
TestCatalogDeregister - 2019/12/30 18:57:06.012585 [INFO] agent: Started DNS server 127.0.0.1:18773 (udp)
TestCatalogDeregister - 2019/12/30 18:57:06.013768 [INFO] agent: Started DNS server 127.0.0.1:18773 (tcp)
TestCatalogDeregister - 2019/12/30 18:57:06.016224 [INFO] agent: Started HTTP server on 127.0.0.1:18774 (tcp)
TestCatalogDeregister - 2019/12/30 18:57:06.016326 [INFO] agent: started state syncer
2019/12/30 18:57:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:06 [INFO]  raft: Node at 127.0.0.1:18778 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:06.071425 [WARN] agent: Node name "Node 55e8c058-32c0-f0fb-b778-6c38f6d801f6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:06.072111 [DEBUG] tlsutil: Update with version 1
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:06.074863 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:06.269844 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:06.270409 [DEBUG] consul: Skipping self join check for "Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723" since the cluster is too small
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:06.270651 [INFO] consul: member 'Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723' joined, marking health alive
TestCatalogNodes - 2019/12/30 18:57:06.564186 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogNodes - 2019/12/30 18:57:06.564750 [DEBUG] consul: Skipping self join check for "Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd" since the cluster is too small
TestCatalogNodes - 2019/12/30 18:57:06.564955 [INFO] consul: member 'Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd' joined, marking health alive
2019/12/30 18:57:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:07 [INFO]  raft: Node at 127.0.0.1:18778 [Leader] entering Leader state
TestCatalogDeregister - 2019/12/30 18:57:07.189630 [INFO] consul: cluster leadership acquired
TestCatalogDeregister - 2019/12/30 18:57:07.190175 [INFO] consul: New leader elected: Node 8a1e3daf-5832-ebe3-5826-c69c9eec4002
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.215336 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.215584 [INFO] agent: Requesting shutdown
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.215666 [INFO] consul: shutting down server
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.215714 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.219733 [WARN] consul: error getting server health from "Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723": rpc error making call: EOF
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.352190 [WARN] serf: Shutdown without a Leave
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.385430 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.385516 [DEBUG] agent: Node info in sync
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.477345 [INFO] manager: shutting down
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.477865 [INFO] agent: consul server down
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.477930 [INFO] agent: shutdown complete
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.477986 [INFO] agent: Stopping DNS server 127.0.0.1:18755 (tcp)
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.478137 [INFO] agent: Stopping DNS server 127.0.0.1:18755 (udp)
TestCatalogNodes - 2019/12/30 18:57:07.478295 [INFO] agent: Requesting shutdown
TestCatalogNodes - 2019/12/30 18:57:07.478361 [INFO] consul: shutting down server
TestCatalogNodes - 2019/12/30 18:57:07.478411 [WARN] serf: Shutdown without a Leave
TestCatalogNodes - 2019/12/30 18:57:07.478606 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.478299 [INFO] agent: Stopping HTTP server 127.0.0.1:18756 (tcp)
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.478881 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:07.478962 [INFO] agent: Endpoints down
--- PASS: TestCatalogNodes_MetaFilter (5.47s)
=== CONT  TestBlacklist
--- PASS: TestBlacklist (0.00s)
=== CONT  TestAgent_consulConfig
TestCatalogNodes - 2019/12/30 18:57:07.482384 [WARN] consul: error getting server health from "Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd": rpc error making call: EOF
TestCatalogNodes - 2019/12/30 18:57:07.552139 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_consulConfig - 2019/12/30 18:57:07.558804 [WARN] agent: Node name "Node ff1b6943-8439-5c26-94cd-21bf98fa146a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogNodes - 2019/12/30 18:57:07.685664 [INFO] manager: shutting down
2019/12/30 18:57:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:55e8c058-32c0-f0fb-b778-6c38f6d801f6 Address:127.0.0.1:18784}]
TestCatalogNodes - 2019/12/30 18:57:07.686153 [INFO] agent: consul server down
TestCatalogNodes - 2019/12/30 18:57:07.686203 [INFO] agent: shutdown complete
TestCatalogNodes - 2019/12/30 18:57:07.686260 [INFO] agent: Stopping DNS server 127.0.0.1:18761 (tcp)
TestCatalogNodes - 2019/12/30 18:57:07.686417 [INFO] agent: Stopping DNS server 127.0.0.1:18761 (udp)
TestCatalogNodes - 2019/12/30 18:57:07.686591 [INFO] agent: Stopping HTTP server 127.0.0.1:18762 (tcp)
TestCatalogNodes - 2019/12/30 18:57:07.686815 [INFO] agent: Waiting for endpoints to shut down
TestCatalogNodes - 2019/12/30 18:57:07.686902 [INFO] agent: Endpoints down
--- PASS: TestCatalogNodes (5.23s)
=== CONT  TestAgent_ReloadConfigTLSConfigFailure
2019/12/30 18:57:07 [INFO]  raft: Node at 127.0.0.1:18784 [Follower] entering Follower state (Leader: "")
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.690220 [INFO] serf: EventMemberJoin: Node 55e8c058-32c0-f0fb-b778-6c38f6d801f6.dc1 127.0.0.1
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.694571 [INFO] serf: EventMemberJoin: Node 55e8c058-32c0-f0fb-b778-6c38f6d801f6 127.0.0.1
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.695535 [INFO] consul: Adding LAN server Node 55e8c058-32c0-f0fb-b778-6c38f6d801f6 (Addr: tcp/127.0.0.1:18784) (DC: dc1)
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.695851 [INFO] consul: Handled member-join event for server "Node 55e8c058-32c0-f0fb-b778-6c38f6d801f6.dc1" in area "wan"
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.696058 [INFO] agent: Started DNS server 127.0.0.1:18779 (udp)
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.696344 [INFO] agent: Started DNS server 127.0.0.1:18779 (tcp)
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.698772 [INFO] agent: Started HTTP server on 127.0.0.1:18780 (tcp)
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:07.698975 [INFO] agent: started state syncer
2019/12/30 18:57:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:07 [INFO]  raft: Node at 127.0.0.1:18784 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:07.832654 [WARN] agent: Node name "Node 2f5f44dd-15c4-cd4d-2ef6-d926fb2dabfb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogDeregister - 2019/12/30 18:57:07.860167 [INFO] agent: Requesting shutdown
TestCatalogDeregister - 2019/12/30 18:57:07.860284 [INFO] consul: shutting down server
TestCatalogDeregister - 2019/12/30 18:57:07.860332 [WARN] serf: Shutdown without a Leave
TestCatalogDeregister - 2019/12/30 18:57:07.870455 [INFO] agent: Synced node info
TestCatalogDeregister - 2019/12/30 18:57:07.870582 [DEBUG] agent: Node info in sync
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:07.945923 [DEBUG] tlsutil: Update with version 1
TestAgent_consulConfig - 2019/12/30 18:57:07.947021 [DEBUG] tlsutil: Update with version 1
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:07.948271 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_consulConfig - 2019/12/30 18:57:07.949161 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogDeregister - 2019/12/30 18:57:08.043893 [WARN] serf: Shutdown without a Leave
TestCatalogDeregister - 2019/12/30 18:57:08.204327 [INFO] manager: shutting down
TestCatalogNodes_MetaFilter - 2019/12/30 18:57:08.215596 [WARN] consul: error getting server health from "Node 349b7b00-0c1e-2d8b-65f3-c258a5daa723": context deadline exceeded
TestCatalogNodes - 2019/12/30 18:57:08.478860 [WARN] consul: error getting server health from "Node d4b43436-7132-fa9f-56e4-0e3c3ae72bfd": context deadline exceeded
TestCatalogDeregister - 2019/12/30 18:57:08.536067 [INFO] agent: consul server down
TestCatalogDeregister - 2019/12/30 18:57:08.536180 [INFO] agent: shutdown complete
TestCatalogDeregister - 2019/12/30 18:57:08.536260 [INFO] agent: Stopping DNS server 127.0.0.1:18773 (tcp)
TestCatalogDeregister - 2019/12/30 18:57:08.536467 [INFO] agent: Stopping DNS server 127.0.0.1:18773 (udp)
TestCatalogDeregister - 2019/12/30 18:57:08.536694 [INFO] agent: Stopping HTTP server 127.0.0.1:18774 (tcp)
TestCatalogDeregister - 2019/12/30 18:57:08.536996 [INFO] agent: Waiting for endpoints to shut down
TestCatalogDeregister - 2019/12/30 18:57:08.537090 [INFO] agent: Endpoints down
--- PASS: TestCatalogDeregister (4.01s)
=== CONT  TestAgent_ReloadConfigIncomingRPCConfig
TestCatalogDeregister - 2019/12/30 18:57:08.537940 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCatalogDeregister - 2019/12/30 18:57:08.538182 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
2019/12/30 18:57:08 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:08 [INFO]  raft: Node at 127.0.0.1:18784 [Leader] entering Leader state
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:08.615621 [INFO] consul: cluster leadership acquired
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:08.616073 [INFO] consul: New leader elected: Node 55e8c058-32c0-f0fb-b778-6c38f6d801f6
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:08.734005 [INFO] agent: Requesting shutdown
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:08.734104 [INFO] consul: shutting down server
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:08.734175 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:08.804679 [WARN] agent: Node name "Node a91e19a9-ed71-0251-4a65-04a689ddb627" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:08.831922 [DEBUG] tlsutil: Update with version 1
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:08.835441 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:08.857529 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:57:08.951047 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:08.951152 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/30 18:57:08.951366 [DEBUG] agent: Node info in sync
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.019105 [INFO] manager: shutting down
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.102743 [INFO] agent: consul server down
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.102870 [INFO] agent: shutdown complete
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.102947 [INFO] agent: Stopping DNS server 127.0.0.1:18779 (tcp)
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.103178 [INFO] agent: Stopping DNS server 127.0.0.1:18779 (udp)
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.103398 [INFO] agent: Stopping HTTP server 127.0.0.1:18780 (tcp)
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.103684 [INFO] agent: Waiting for endpoints to shut down
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.103779 [INFO] agent: Endpoints down
--- PASS: TestCatalogRegister_Service_InvalidAddress (3.11s)
=== CONT  TestAgent_ReloadConfigOutgoingRPCConfig
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.105966 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.106054 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestCatalogRegister_Service_InvalidAddress - 2019/12/30 18:57:09.106222 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:09.234700 [WARN] agent: Node name "Node 04477938-1695-68eb-5b2a-418fef93fbf5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:09.240769 [DEBUG] tlsutil: Update with version 1
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:09.243615 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2f5f44dd-15c4-cd4d-2ef6-d926fb2dabfb Address:127.0.0.1:18796}]
2019/12/30 18:57:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ff1b6943-8439-5c26-94cd-21bf98fa146a Address:127.0.0.1:18790}]
2019/12/30 18:57:09 [INFO]  raft: Node at 127.0.0.1:18796 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:09 [INFO]  raft: Node at 127.0.0.1:18790 [Follower] entering Follower state (Leader: "")
TestAgent_consulConfig - 2019/12/30 18:57:09.320732 [DEBUG] tlsutil: UpdateAutoEncryptCA with version 2
TestAgent_consulConfig - 2019/12/30 18:57:09.321955 [INFO] serf: EventMemberJoin: Node ff1b6943-8439-5c26-94cd-21bf98fa146a.dc1 127.0.0.1
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.321955 [INFO] serf: EventMemberJoin: Node 2f5f44dd-15c4-cd4d-2ef6-d926fb2dabfb.dc1 127.0.0.1
TestAgent_consulConfig - 2019/12/30 18:57:09.326546 [INFO] serf: EventMemberJoin: Node ff1b6943-8439-5c26-94cd-21bf98fa146a 127.0.0.1
TestAgent_consulConfig - 2019/12/30 18:57:09.328134 [INFO] agent: Started DNS server 127.0.0.1:18785 (udp)
TestAgent_consulConfig - 2019/12/30 18:57:09.329629 [INFO] consul: Adding LAN server Node ff1b6943-8439-5c26-94cd-21bf98fa146a (Addr: tcp/127.0.0.1:18790) (DC: dc1)
TestAgent_consulConfig - 2019/12/30 18:57:09.330343 [INFO] agent: Started DNS server 127.0.0.1:18785 (tcp)
TestAgent_consulConfig - 2019/12/30 18:57:09.330836 [INFO] consul: Handled member-join event for server "Node ff1b6943-8439-5c26-94cd-21bf98fa146a.dc1" in area "wan"
TestAgent_consulConfig - 2019/12/30 18:57:09.333398 [INFO] agent: Started HTTP server on 127.0.0.1:18786 (tcp)
TestAgent_consulConfig - 2019/12/30 18:57:09.334354 [INFO] agent: started state syncer
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.336395 [INFO] serf: EventMemberJoin: Node 2f5f44dd-15c4-cd4d-2ef6-d926fb2dabfb 127.0.0.1
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.339195 [INFO] consul: Handled member-join event for server "Node 2f5f44dd-15c4-cd4d-2ef6-d926fb2dabfb.dc1" in area "wan"
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.339232 [INFO] consul: Adding LAN server Node 2f5f44dd-15c4-cd4d-2ef6-d926fb2dabfb (Addr: tcp/127.0.0.1:18796) (DC: dc1)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.344358 [INFO] agent: Started DNS server 127.0.0.1:18791 (tcp)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.345341 [INFO] agent: Started DNS server 127.0.0.1:18791 (udp)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.348026 [INFO] agent: Started HTTP server on 127.0.0.1:18792 (tcp)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:09.348152 [INFO] agent: started state syncer
2019/12/30 18:57:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:09 [INFO]  raft: Node at 127.0.0.1:18796 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:09 [INFO]  raft: Node at 127.0.0.1:18790 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a91e19a9-ed71-0251-4a65-04a689ddb627 Address:127.0.0.1:18802}]
2019/12/30 18:57:11 [INFO]  raft: Node at 127.0.0.1:18802 [Follower] entering Follower state (Leader: "")
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.454696 [INFO] serf: EventMemberJoin: Node a91e19a9-ed71-0251-4a65-04a689ddb627.dc1 127.0.0.1
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.461184 [INFO] serf: EventMemberJoin: Node a91e19a9-ed71-0251-4a65-04a689ddb627 127.0.0.1
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.462154 [INFO] consul: Adding LAN server Node a91e19a9-ed71-0251-4a65-04a689ddb627 (Addr: tcp/127.0.0.1:18802) (DC: dc1)
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.462429 [INFO] consul: Handled member-join event for server "Node a91e19a9-ed71-0251-4a65-04a689ddb627.dc1" in area "wan"
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.465521 [INFO] agent: Started DNS server 127.0.0.1:18797 (tcp)
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.465621 [INFO] agent: Started DNS server 127.0.0.1:18797 (udp)
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.478390 [INFO] agent: Started HTTP server on 127.0.0.1:18798 (tcp)
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:11.478508 [INFO] agent: started state syncer
2019/12/30 18:57:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:11 [INFO]  raft: Node at 127.0.0.1:18802 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:11 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:11 [INFO]  raft: Node at 127.0.0.1:18796 [Leader] entering Leader state
2019/12/30 18:57:11 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:11 [INFO]  raft: Node at 127.0.0.1:18790 [Leader] entering Leader state
TestAgent_consulConfig - 2019/12/30 18:57:11.696012 [INFO] consul: cluster leadership acquired
TestAgent_consulConfig - 2019/12/30 18:57:11.696415 [INFO] consul: New leader elected: Node ff1b6943-8439-5c26-94cd-21bf98fa146a
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:11.696655 [INFO] consul: cluster leadership acquired
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:11.696981 [INFO] consul: New leader elected: Node 2f5f44dd-15c4-cd4d-2ef6-d926fb2dabfb
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:11.784253 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_consulConfig - 2019/12/30 18:57:11.885566 [INFO] agent: Requesting shutdown
TestAgent_consulConfig - 2019/12/30 18:57:11.885688 [INFO] consul: shutting down server
TestAgent_consulConfig - 2019/12/30 18:57:11.885742 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
2019/12/30 18:57:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:04477938-1695-68eb-5b2a-418fef93fbf5 Address:127.0.0.1:18808}]
2019/12/30 18:57:11 [INFO]  raft: Node at 127.0.0.1:18808 [Follower] entering Follower state (Leader: "")
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.957480 [INFO] serf: EventMemberJoin: Node 04477938-1695-68eb-5b2a-418fef93fbf5.dc1 127.0.0.1
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.963266 [INFO] serf: EventMemberJoin: Node 04477938-1695-68eb-5b2a-418fef93fbf5 127.0.0.1
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.964620 [INFO] consul: Adding LAN server Node 04477938-1695-68eb-5b2a-418fef93fbf5 (Addr: tcp/127.0.0.1:18808) (DC: dc1)
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.964913 [INFO] consul: Handled member-join event for server "Node 04477938-1695-68eb-5b2a-418fef93fbf5.dc1" in area "wan"
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.965580 [INFO] agent: Started DNS server 127.0.0.1:18803 (udp)
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.965675 [INFO] agent: Started DNS server 127.0.0.1:18803 (tcp)
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.967999 [INFO] agent: Started HTTP server on 127.0.0.1:18804 (tcp)
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:11.968110 [INFO] agent: started state syncer
2019/12/30 18:57:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:11 [INFO]  raft: Node at 127.0.0.1:18808 [Candidate] entering Candidate state in term 2
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.054321 [INFO] agent: Synced node info
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.054704 [DEBUG] tlsutil: Update with version 1
2019/12/30 18:57:12 [INFO]  raft: Election won. Tally: 1
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.054850 [DEBUG] tlsutil: IncomingRPCConfig with version 1
2019/12/30 18:57:12 [INFO]  raft: Node at 127.0.0.1:18802 [Leader] entering Leader state
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.055062 [INFO] agent: Requesting shutdown
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.055125 [INFO] consul: shutting down server
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.055170 [WARN] serf: Shutdown without a Leave
TestAgent_consulConfig - 2019/12/30 18:57:12.057195 [INFO] agent: Synced node info
TestAgent_consulConfig - 2019/12/30 18:57:12.057316 [DEBUG] agent: Node info in sync
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.057441 [INFO] consul: cluster leadership acquired
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.057863 [INFO] consul: New leader elected: Node a91e19a9-ed71-0251-4a65-04a689ddb627
TestAgent_consulConfig - 2019/12/30 18:57:12.059488 [WARN] serf: Shutdown without a Leave
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.170945 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.171159 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.212247 [WARN] serf: Shutdown without a Leave
TestAgent_consulConfig - 2019/12/30 18:57:12.212305 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_consulConfig - 2019/12/30 18:57:12.304111 [INFO] agent: consul server down
TestAgent_consulConfig - 2019/12/30 18:57:12.304201 [INFO] agent: shutdown complete
TestAgent_consulConfig - 2019/12/30 18:57:12.304292 [INFO] agent: Stopping DNS server 127.0.0.1:18785 (tcp)
TestAgent_consulConfig - 2019/12/30 18:57:12.304508 [INFO] agent: Stopping DNS server 127.0.0.1:18785 (udp)
TestAgent_consulConfig - 2019/12/30 18:57:12.304680 [INFO] agent: Stopping HTTP server 127.0.0.1:18786 (tcp)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.304715 [INFO] manager: shutting down
TestAgent_consulConfig - 2019/12/30 18:57:12.304876 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_consulConfig - 2019/12/30 18:57:12.304877 [INFO] agent: Waiting for endpoints to shut down
TestAgent_consulConfig - 2019/12/30 18:57:12.305018 [INFO] agent: Endpoints down
TestAgent_consulConfig - 2019/12/30 18:57:12.305163 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_consulConfig - 2019/12/30 18:57:12.305320 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_consulConfig - 2019/12/30 18:57:12.305388 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestAgent_consulConfig - 2019/12/30 18:57:12.305437 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestAgent_consulConfig - 2019/12/30 18:57:12.305483 [ERR] consul: failed to transfer leadership in 3 attempts
--- PASS: TestAgent_consulConfig (4.83s)
=== CONT  TestAgent_loadTokens
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadTokens - 2019/12/30 18:57:12.364335 [WARN] agent: Node name "Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadTokens - 2019/12/30 18:57:12.364804 [DEBUG] tlsutil: Update with version 1
TestAgent_loadTokens - 2019/12/30 18:57:12.366940 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.594839 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.595030 [INFO] agent: consul server down
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.595087 [INFO] agent: shutdown complete
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.595122 [INFO] agent: Synced node info
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.595146 [INFO] agent: Stopping DNS server 127.0.0.1:18791 (tcp)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.595305 [INFO] agent: Stopping DNS server 127.0.0.1:18791 (udp)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.595486 [INFO] agent: Stopping HTTP server 127.0.0.1:18792 (tcp)
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.595726 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ReloadConfigTLSConfigFailure - 2019/12/30 18:57:12.595796 [INFO] agent: Endpoints down
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.602411 [DEBUG] tlsutil: Update with version 2
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.603027 [DEBUG] tlsutil: IncomingRPCConfig with version 2
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.603259 [INFO] agent: Requesting shutdown
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.603746 [INFO] consul: shutting down server
--- PASS: TestAgent_ReloadConfigTLSConfigFailure (4.92s)
=== CONT  TestAgent_SetupProxyManager
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.603902 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.668958 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:12 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:12 [INFO]  raft: Node at 127.0.0.1:18808 [Leader] entering Leader state
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:12.688397 [INFO] consul: cluster leadership acquired
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:12.689069 [INFO] consul: New leader elected: Node 04477938-1695-68eb-5b2a-418fef93fbf5
WARNING: bootstrap = true: do not enable unless necessary
--- PASS: TestAgent_SetupProxyManager (0.12s)
=== CONT  TestAgent_ReLoadProxiesFromConfig
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.744096 [INFO] manager: shutting down
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.744872 [INFO] agent: consul server down
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.744939 [INFO] agent: shutdown complete
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.745017 [INFO] agent: Stopping DNS server 127.0.0.1:18797 (tcp)
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.745187 [INFO] agent: Stopping DNS server 127.0.0.1:18797 (udp)
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.745365 [INFO] agent: Stopping HTTP server 127.0.0.1:18798 (tcp)
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.745629 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.745713 [INFO] agent: Endpoints down
--- PASS: TestAgent_ReloadConfigIncomingRPCConfig (4.21s)
=== CONT  TestAgent_RemoveProxy
TestAgent_ReloadConfigIncomingRPCConfig - 2019/12/30 18:57:12.756213 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_RemoveProxy - 2019/12/30 18:57:12.806531 [DEBUG] tlsutil: Update with version 1
TestAgent_RemoveProxy - 2019/12/30 18:57:12.808830 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:12.819203 [DEBUG] tlsutil: Update with version 1
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:12.821400 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:12.879795 [DEBUG] tlsutil: OutgoingRPCConfig with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.061937 [INFO] agent: Synced node info
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.070185 [DEBUG] tlsutil: Update with version 2
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.070940 [DEBUG] tlsutil: OutgoingRPCConfig with version 2
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.071124 [INFO] agent: Requesting shutdown
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.071203 [INFO] consul: shutting down server
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.071306 [WARN] serf: Shutdown without a Leave
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.145637 [WARN] serf: Shutdown without a Leave
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.221772 [INFO] manager: shutting down
2019/12/30 18:57:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe Address:127.0.0.1:18814}]
2019/12/30 18:57:13 [INFO]  raft: Node at 127.0.0.1:18814 [Follower] entering Follower state (Leader: "")
TestAgent_loadTokens - 2019/12/30 18:57:13.322932 [INFO] serf: EventMemberJoin: Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe.dc1 127.0.0.1
TestAgent_loadTokens - 2019/12/30 18:57:13.326920 [INFO] serf: EventMemberJoin: Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe 127.0.0.1
TestAgent_loadTokens - 2019/12/30 18:57:13.327616 [INFO] consul: Adding LAN server Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe (Addr: tcp/127.0.0.1:18814) (DC: dc1)
TestAgent_loadTokens - 2019/12/30 18:57:13.327884 [INFO] consul: Handled member-join event for server "Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe.dc1" in area "wan"
TestAgent_loadTokens - 2019/12/30 18:57:13.328453 [INFO] agent: Started DNS server 127.0.0.1:18809 (tcp)
TestAgent_loadTokens - 2019/12/30 18:57:13.328526 [INFO] agent: Started DNS server 127.0.0.1:18809 (udp)
TestAgent_loadTokens - 2019/12/30 18:57:13.330959 [INFO] agent: Started HTTP server on 127.0.0.1:18810 (tcp)
TestAgent_loadTokens - 2019/12/30 18:57:13.331057 [INFO] agent: started state syncer
2019/12/30 18:57:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:13 [INFO]  raft: Node at 127.0.0.1:18814 [Candidate] entering Candidate state in term 2
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.461024 [INFO] agent: consul server down
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.461116 [INFO] agent: shutdown complete
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.461186 [INFO] agent: Stopping DNS server 127.0.0.1:18803 (tcp)
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.461380 [INFO] agent: Stopping DNS server 127.0.0.1:18803 (udp)
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.461570 [INFO] agent: Stopping HTTP server 127.0.0.1:18804 (tcp)
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.461843 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.461935 [INFO] agent: Endpoints down
--- PASS: TestAgent_ReloadConfigOutgoingRPCConfig (4.36s)
=== CONT  TestAgent_reloadWatchesHTTPS
TestAgent_ReloadConfigOutgoingRPCConfig - 2019/12/30 18:57:13.466155 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:13.521389 [WARN] agent: Node name "Node 3f3822cc-8a95-bca3-bb75-f8ddff468e96" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:13.521881 [DEBUG] tlsutil: Update with version 1
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:13.524102 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3985c7b6-00aa-ebfb-36de-73a325dee06e Address:127.0.0.1:18826}]
2019/12/30 18:57:13 [INFO]  raft: Node at 127.0.0.1:18826 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:78fad118-37a5-1dff-d901-a3681d436c14 Address:127.0.0.1:18820}]
2019/12/30 18:57:13 [INFO]  raft: Node at 127.0.0.1:18820 [Follower] entering Follower state (Leader: "")
TestAgent_RemoveProxy - 2019/12/30 18:57:13.749594 [INFO] serf: EventMemberJoin: node1.dc1 127.0.0.1
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.752549 [INFO] serf: EventMemberJoin: node1.dc1 127.0.0.1
TestAgent_RemoveProxy - 2019/12/30 18:57:13.754021 [INFO] serf: EventMemberJoin: node1 127.0.0.1
TestAgent_RemoveProxy - 2019/12/30 18:57:13.755379 [INFO] agent: Started DNS server 127.0.0.1:18821 (udp)
TestAgent_RemoveProxy - 2019/12/30 18:57:13.755701 [INFO] consul: Adding LAN server node1 (Addr: tcp/127.0.0.1:18826) (DC: dc1)
TestAgent_RemoveProxy - 2019/12/30 18:57:13.755700 [INFO] consul: Handled member-join event for server "node1.dc1" in area "wan"
TestAgent_RemoveProxy - 2019/12/30 18:57:13.756335 [INFO] agent: Started DNS server 127.0.0.1:18821 (tcp)
TestAgent_RemoveProxy - 2019/12/30 18:57:13.758698 [INFO] agent: Started HTTP server on 127.0.0.1:18822 (tcp)
TestAgent_RemoveProxy - 2019/12/30 18:57:13.758818 [INFO] agent: started state syncer
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.760886 [INFO] serf: EventMemberJoin: node1 127.0.0.1
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.761978 [INFO] consul: Adding LAN server node1 (Addr: tcp/127.0.0.1:18820) (DC: dc1)
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.762218 [INFO] consul: Handled member-join event for server "node1.dc1" in area "wan"
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.765682 [INFO] agent: Started DNS server 127.0.0.1:18815 (udp)
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.767581 [INFO] agent: Started DNS server 127.0.0.1:18815 (tcp)
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.772510 [INFO] agent: Started HTTP server on 127.0.0.1:18816 (tcp)
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:13.772633 [INFO] agent: started state syncer
2019/12/30 18:57:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:13 [INFO]  raft: Node at 127.0.0.1:18826 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:13 [INFO]  raft: Node at 127.0.0.1:18820 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:13 [INFO]  raft: Node at 127.0.0.1:18814 [Leader] entering Leader state
TestAgent_loadTokens - 2019/12/30 18:57:13.929165 [INFO] consul: cluster leadership acquired
TestAgent_loadTokens - 2019/12/30 18:57:13.929704 [INFO] consul: New leader elected: Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe
TestAgent_loadTokens - 2019/12/30 18:57:14.080363 [INFO] acl: initializing acls
TestAgent_loadTokens - 2019/12/30 18:57:14.161712 [ERR] agent: failed to sync remote state: ACL not found
2019/12/30 18:57:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3f3822cc-8a95-bca3-bb75-f8ddff468e96 Address:127.0.0.1:18832}]
2019/12/30 18:57:14 [INFO]  raft: Node at 127.0.0.1:18832 [Follower] entering Follower state (Leader: "")
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.648747 [INFO] serf: EventMemberJoin: Node 3f3822cc-8a95-bca3-bb75-f8ddff468e96.dc1 127.0.0.1
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.657564 [INFO] serf: EventMemberJoin: Node 3f3822cc-8a95-bca3-bb75-f8ddff468e96 127.0.0.1
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.658473 [INFO] consul: Handled member-join event for server "Node 3f3822cc-8a95-bca3-bb75-f8ddff468e96.dc1" in area "wan"
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.658747 [INFO] consul: Adding LAN server Node 3f3822cc-8a95-bca3-bb75-f8ddff468e96 (Addr: tcp/127.0.0.1:18832) (DC: dc1)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.659870 [INFO] agent: Started DNS server 127.0.0.1:18827 (tcp)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.661161 [INFO] agent: Started DNS server 127.0.0.1:18827 (udp)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.661394 [DEBUG] tlsutil: IncomingHTTPSConfig with version 1
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.663563 [INFO] agent: Started HTTPS server on 127.0.0.1:18829 (tcp)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:14.663651 [INFO] agent: started state syncer
2019/12/30 18:57:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:14 [INFO]  raft: Node at 127.0.0.1:18832 [Candidate] entering Candidate state in term 2
TestAgent_loadTokens - 2019/12/30 18:57:14.786445 [ERR] agent: failed to sync remote state: ACL not found
2019/12/30 18:57:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:14 [INFO]  raft: Node at 127.0.0.1:18820 [Leader] entering Leader state
TestAgent_loadTokens - 2019/12/30 18:57:14.877796 [INFO] acl: initializing acls
TestAgent_loadTokens - 2019/12/30 18:57:14.878204 [INFO] consul: Created ACL 'global-management' policy
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:14.878599 [INFO] consul: cluster leadership acquired
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:14.879052 [INFO] consul: New leader elected: node1
2019/12/30 18:57:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:14 [INFO]  raft: Node at 127.0.0.1:18826 [Leader] entering Leader state
TestAgent_RemoveProxy - 2019/12/30 18:57:14.881642 [INFO] consul: cluster leadership acquired
TestAgent_RemoveProxy - 2019/12/30 18:57:14.882032 [INFO] consul: New leader elected: node1
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:14.964284 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestAgent_loadTokens - 2019/12/30 18:57:15.131136 [INFO] consul: Created ACL 'global-management' policy
TestAgent_RemoveProxy - 2019/12/30 18:57:15.296359 [INFO] agent: Synced node info
TestAgent_RemoveProxy - 2019/12/30 18:57:15.297367 [DEBUG] agent: removed check "service:web-proxy"
TestAgent_RemoveProxy - 2019/12/30 18:57:15.297434 [DEBUG] agent: removed service "web-proxy"
TestAgent_RemoveProxy - 2019/12/30 18:57:15.297693 [INFO] agent: Requesting shutdown
TestAgent_RemoveProxy - 2019/12/30 18:57:15.297750 [INFO] consul: shutting down server
TestAgent_RemoveProxy - 2019/12/30 18:57:15.297791 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:15 [INFO]  raft: Node at 127.0.0.1:18832 [Leader] entering Leader state
TestAgent_loadTokens - 2019/12/30 18:57:15.378637 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_loadTokens - 2019/12/30 18:57:15.378739 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgent_RemoveProxy - 2019/12/30 18:57:15.378733 [WARN] serf: Shutdown without a Leave
TestAgent_loadTokens - 2019/12/30 18:57:15.379700 [INFO] serf: EventMemberUpdate: Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe
TestAgent_loadTokens - 2019/12/30 18:57:15.380398 [INFO] serf: EventMemberUpdate: Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe.dc1
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:15.380909 [INFO] agent: Synced service "web"
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.382553 [INFO] consul: cluster leadership acquired
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.383010 [INFO] consul: New leader elected: Node 3f3822cc-8a95-bca3-bb75-f8ddff468e96
TestAgent_loadTokens - 2019/12/30 18:57:15.403328 [INFO] agent: Requesting shutdown
TestAgent_loadTokens - 2019/12/30 18:57:15.403426 [INFO] consul: shutting down server
TestAgent_loadTokens - 2019/12/30 18:57:15.403486 [WARN] serf: Shutdown without a Leave
TestAgent_RemoveProxy - 2019/12/30 18:57:15.452478 [INFO] manager: shutting down
TestAgent_RemoveProxy - 2019/12/30 18:57:15.453252 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_RemoveProxy - 2019/12/30 18:57:15.453488 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_RemoveProxy - 2019/12/30 18:57:15.454443 [INFO] agent: consul server down
TestAgent_RemoveProxy - 2019/12/30 18:57:15.454513 [INFO] agent: shutdown complete
TestAgent_RemoveProxy - 2019/12/30 18:57:15.454585 [INFO] agent: Stopping DNS server 127.0.0.1:18821 (tcp)
TestAgent_RemoveProxy - 2019/12/30 18:57:15.454758 [INFO] agent: Stopping DNS server 127.0.0.1:18821 (udp)
TestAgent_RemoveProxy - 2019/12/30 18:57:15.454988 [INFO] agent: Stopping HTTP server 127.0.0.1:18822 (tcp)
TestAgent_RemoveProxy - 2019/12/30 18:57:15.455251 [INFO] agent: Waiting for endpoints to shut down
TestAgent_RemoveProxy - 2019/12/30 18:57:15.455334 [INFO] agent: Endpoints down
--- PASS: TestAgent_RemoveProxy (2.71s)
=== CONT  TestAgent_reloadWatches
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.511842 [INFO] agent: Requesting shutdown
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.511941 [INFO] consul: shutting down server
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.511986 [WARN] serf: Shutdown without a Leave
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.512344 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.520539 [DEBUG] tlsutil: IncomingHTTPSConfig with version 1
2019/12/30 18:57:15 http: TLS handshake error from 127.0.0.1:58552: tls: certificate private key (<nil>) does not implement crypto.Signer
2019/12/30 18:57:15 [ERR] consul.watch: Watch (type: key) errored: Get https://127.0.0.1:18829/v1/kv/asdf: remote error: tls: internal error, retry in 5s
TestAgent_loadTokens - 2019/12/30 18:57:15.543968 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_reloadWatches - 2019/12/30 18:57:15.554540 [WARN] agent: Node name "Node d653a94c-6132-6650-5cf0-3078dc7c9eb2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_reloadWatches - 2019/12/30 18:57:15.555219 [DEBUG] tlsutil: Update with version 1
TestAgent_reloadWatches - 2019/12/30 18:57:15.557446 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_loadTokens - 2019/12/30 18:57:15.638978 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_loadTokens - 2019/12/30 18:57:15.639991 [INFO] serf: EventMemberUpdate: Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.640433 [WARN] serf: Shutdown without a Leave
TestAgent_loadTokens - 2019/12/30 18:57:15.640746 [INFO] manager: shutting down
TestAgent_loadTokens - 2019/12/30 18:57:15.640831 [INFO] serf: EventMemberUpdate: Node d2c63416-95b7-2ba7-f93a-59f5bfa9d3fe.dc1
TestAgent_loadTokens - 2019/12/30 18:57:15.642274 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestAgent_loadTokens - 2019/12/30 18:57:15.642512 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_loadTokens - 2019/12/30 18:57:15.641163 [INFO] agent: consul server down
TestAgent_loadTokens - 2019/12/30 18:57:15.643742 [INFO] agent: shutdown complete
TestAgent_loadTokens - 2019/12/30 18:57:15.643795 [INFO] agent: Stopping DNS server 127.0.0.1:18809 (tcp)
TestAgent_loadTokens - 2019/12/30 18:57:15.644001 [INFO] agent: Stopping DNS server 127.0.0.1:18809 (udp)
TestAgent_loadTokens - 2019/12/30 18:57:15.644164 [INFO] agent: Stopping HTTP server 127.0.0.1:18810 (tcp)
TestAgent_loadTokens - 2019/12/30 18:57:15.644360 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadTokens - 2019/12/30 18:57:15.644492 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadTokens (3.34s)
=== CONT  TestAgent_Join_ACLDeny
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Join_ACLDeny - 2019/12/30 18:57:15.745845 [WARN] agent: Node name "Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Join_ACLDeny - 2019/12/30 18:57:15.746380 [DEBUG] tlsutil: Update with version 1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:15.748852 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.795878 [INFO] manager: shutting down
jones - 2019/12/30 18:57:15.912612 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef.dc1 (Addr: tcp/127.0.0.1:17710) (DC: dc1)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.969577 [INFO] agent: consul server down
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.969661 [INFO] agent: shutdown complete
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.969731 [INFO] agent: Stopping DNS server 127.0.0.1:18827 (tcp)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.969928 [INFO] agent: Stopping DNS server 127.0.0.1:18827 (udp)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.970163 [INFO] agent: Stopping HTTPS server 127.0.0.1:18829 (tcp)
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.970465 [INFO] agent: Waiting for endpoints to shut down
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.970556 [INFO] agent: Endpoints down
--- PASS: TestAgent_reloadWatchesHTTPS (2.51s)
=== CONT  TestAgent_loadCheckState
TestAgent_reloadWatchesHTTPS - 2019/12/30 18:57:15.973272 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadCheckState - 2019/12/30 18:57:16.039243 [WARN] agent: Node name "Node 6cc5d5d3-4c14-ec65-09d3-dd18abfdfe1c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadCheckState - 2019/12/30 18:57:16.039834 [DEBUG] tlsutil: Update with version 1
TestAgent_loadCheckState - 2019/12/30 18:57:16.043261 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.164812 [INFO] agent: Synced service "web-proxy"
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.164913 [DEBUG] agent: Check "service:web-proxy" in sync
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.164960 [DEBUG] agent: Node info in sync
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.165180 [DEBUG] agent: Service "web" in sync
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.165234 [DEBUG] agent: Service "web-proxy" in sync
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.165279 [DEBUG] agent: Check "service:web-proxy" in sync
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.165320 [DEBUG] agent: Node info in sync
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.165485 [DEBUG] agent: removed check "service:web-proxy"
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.165536 [DEBUG] agent: removed service "web-proxy"
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.548167 [DEBUG] agent: removed check "service:web-proxy"
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.548258 [DEBUG] agent: removed service "web-proxy"
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.549519 [DEBUG] agent: purging stale persisted proxy "web-proxy"
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.550444 [INFO] agent: Requesting shutdown
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.550540 [INFO] consul: shutting down server
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.550586 [WARN] serf: Shutdown without a Leave
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.629826 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d653a94c-6132-6650-5cf0-3078dc7c9eb2 Address:127.0.0.1:18838}]
2019/12/30 18:57:16 [INFO]  raft: Node at 127.0.0.1:18838 [Follower] entering Follower state (Leader: "")
TestAgent_reloadWatches - 2019/12/30 18:57:16.635723 [INFO] serf: EventMemberJoin: Node d653a94c-6132-6650-5cf0-3078dc7c9eb2.dc1 127.0.0.1
TestAgent_reloadWatches - 2019/12/30 18:57:16.640683 [INFO] serf: EventMemberJoin: Node d653a94c-6132-6650-5cf0-3078dc7c9eb2 127.0.0.1
TestAgent_reloadWatches - 2019/12/30 18:57:16.642233 [INFO] consul: Adding LAN server Node d653a94c-6132-6650-5cf0-3078dc7c9eb2 (Addr: tcp/127.0.0.1:18838) (DC: dc1)
TestAgent_reloadWatches - 2019/12/30 18:57:16.643166 [INFO] consul: Handled member-join event for server "Node d653a94c-6132-6650-5cf0-3078dc7c9eb2.dc1" in area "wan"
TestAgent_reloadWatches - 2019/12/30 18:57:16.646017 [INFO] agent: Started DNS server 127.0.0.1:18833 (tcp)
TestAgent_reloadWatches - 2019/12/30 18:57:16.646201 [INFO] agent: Started DNS server 127.0.0.1:18833 (udp)
TestAgent_reloadWatches - 2019/12/30 18:57:16.654787 [INFO] agent: Started HTTP server on 127.0.0.1:18834 (tcp)
TestAgent_reloadWatches - 2019/12/30 18:57:16.654949 [INFO] agent: started state syncer
2019/12/30 18:57:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:16 [INFO]  raft: Node at 127.0.0.1:18838 [Candidate] entering Candidate state in term 2
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.737834 [INFO] manager: shutting down
2019/12/30 18:57:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d3e9f8b7-2512-8da9-bed8-314bbdcd41ce Address:127.0.0.1:18844}]
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.816520 [INFO] serf: EventMemberJoin: Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce.dc1 127.0.0.1
2019/12/30 18:57:16 [INFO]  raft: Node at 127.0.0.1:18844 [Follower] entering Follower state (Leader: "")
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.822253 [INFO] serf: EventMemberJoin: Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce 127.0.0.1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.824070 [INFO] consul: Adding LAN server Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce (Addr: tcp/127.0.0.1:18844) (DC: dc1)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.824972 [INFO] consul: Handled member-join event for server "Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce.dc1" in area "wan"
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.828873 [INFO] agent: Started DNS server 127.0.0.1:18839 (tcp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.829239 [INFO] agent: Started DNS server 127.0.0.1:18839 (udp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.833328 [INFO] agent: Started HTTP server on 127.0.0.1:18840 (tcp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:16.833436 [INFO] agent: started state syncer
2019/12/30 18:57:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:16 [INFO]  raft: Node at 127.0.0.1:18844 [Candidate] entering Candidate state in term 2
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.969704 [INFO] agent: consul server down
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.969854 [INFO] agent: shutdown complete
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.969961 [INFO] agent: Stopping DNS server 127.0.0.1:18815 (tcp)
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.970160 [INFO] agent: Stopping DNS server 127.0.0.1:18815 (udp)
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.970364 [INFO] agent: Stopping HTTP server 127.0.0.1:18816 (tcp)
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.970637 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.970780 [INFO] agent: Endpoints down
--- PASS: TestAgent_ReLoadProxiesFromConfig (4.25s)
=== CONT  TestAgent_persistCheckState
TestAgent_ReLoadProxiesFromConfig - 2019/12/30 18:57:16.971368 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_persistCheckState - 2019/12/30 18:57:17.060149 [WARN] agent: Node name "Node 1a812a8f-495b-85cb-7817-0c357e1ddf4c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_persistCheckState - 2019/12/30 18:57:17.060628 [DEBUG] tlsutil: Update with version 1
TestAgent_persistCheckState - 2019/12/30 18:57:17.062763 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6cc5d5d3-4c14-ec65-09d3-dd18abfdfe1c Address:127.0.0.1:18850}]
2019/12/30 18:57:17 [INFO]  raft: Node at 127.0.0.1:18850 [Follower] entering Follower state (Leader: "")
TestAgent_loadCheckState - 2019/12/30 18:57:17.074483 [INFO] serf: EventMemberJoin: Node 6cc5d5d3-4c14-ec65-09d3-dd18abfdfe1c.dc1 127.0.0.1
TestAgent_loadCheckState - 2019/12/30 18:57:17.077448 [INFO] serf: EventMemberJoin: Node 6cc5d5d3-4c14-ec65-09d3-dd18abfdfe1c 127.0.0.1
TestAgent_loadCheckState - 2019/12/30 18:57:17.078618 [INFO] agent: Started DNS server 127.0.0.1:18845 (udp)
TestAgent_loadCheckState - 2019/12/30 18:57:17.085268 [INFO] consul: Adding LAN server Node 6cc5d5d3-4c14-ec65-09d3-dd18abfdfe1c (Addr: tcp/127.0.0.1:18850) (DC: dc1)
TestAgent_loadCheckState - 2019/12/30 18:57:17.085305 [INFO] consul: Handled member-join event for server "Node 6cc5d5d3-4c14-ec65-09d3-dd18abfdfe1c.dc1" in area "wan"
TestAgent_loadCheckState - 2019/12/30 18:57:17.085797 [INFO] agent: Started DNS server 127.0.0.1:18845 (tcp)
TestAgent_loadCheckState - 2019/12/30 18:57:17.088220 [INFO] agent: Started HTTP server on 127.0.0.1:18846 (tcp)
TestAgent_loadCheckState - 2019/12/30 18:57:17.088315 [INFO] agent: started state syncer
2019/12/30 18:57:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:17 [INFO]  raft: Node at 127.0.0.1:18850 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:17 [INFO]  raft: Node at 127.0.0.1:18838 [Leader] entering Leader state
TestAgent_reloadWatches - 2019/12/30 18:57:17.287343 [INFO] consul: cluster leadership acquired
TestAgent_reloadWatches - 2019/12/30 18:57:17.288656 [INFO] consul: New leader elected: Node d653a94c-6132-6650-5cf0-3078dc7c9eb2
TestAgent_reloadWatches - 2019/12/30 18:57:17.311565 [INFO] agent: Requesting shutdown
TestAgent_reloadWatches - 2019/12/30 18:57:17.311688 [INFO] consul: shutting down server
TestAgent_reloadWatches - 2019/12/30 18:57:17.311738 [WARN] serf: Shutdown without a Leave
TestAgent_reloadWatches - 2019/12/30 18:57:17.312237 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_reloadWatches - 2019/12/30 18:57:17.529618 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:17 [INFO]  raft: Node at 127.0.0.1:18844 [Leader] entering Leader state
TestAgent_Join_ACLDeny - 2019/12/30 18:57:17.642178 [INFO] consul: cluster leadership acquired
TestAgent_Join_ACLDeny - 2019/12/30 18:57:17.642637 [INFO] consul: New leader elected: Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce
TestAgent_reloadWatches - 2019/12/30 18:57:17.643169 [INFO] manager: shutting down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:17.765713 [ERR] agent: failed to sync remote state: ACL not found
TestAgent_reloadWatches - 2019/12/30 18:57:17.962226 [INFO] agent: consul server down
TestAgent_reloadWatches - 2019/12/30 18:57:17.962302 [INFO] agent: shutdown complete
TestAgent_reloadWatches - 2019/12/30 18:57:17.962394 [INFO] agent: Stopping DNS server 127.0.0.1:18833 (tcp)
TestAgent_reloadWatches - 2019/12/30 18:57:17.962540 [INFO] agent: Stopping DNS server 127.0.0.1:18833 (udp)
TestAgent_reloadWatches - 2019/12/30 18:57:17.962699 [INFO] agent: Stopping HTTP server 127.0.0.1:18834 (tcp)
TestAgent_reloadWatches - 2019/12/30 18:57:17.962914 [INFO] agent: Waiting for endpoints to shut down
TestAgent_reloadWatches - 2019/12/30 18:57:17.962983 [INFO] agent: Endpoints down
--- PASS: TestAgent_reloadWatches (2.51s)
=== CONT  TestAgent_loadChecks_checkFails
TestAgent_reloadWatches - 2019/12/30 18:57:17.971317 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:18.038957 [WARN] agent: Node name "Node 9c829afb-88aa-18cb-0a51-6e24b73537c1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:18.039580 [DEBUG] tlsutil: Update with version 1
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:18.042320 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:18.061299 [INFO] acl: initializing acls
2019/12/30 18:57:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:18 [INFO]  raft: Node at 127.0.0.1:18850 [Leader] entering Leader state
TestAgent_loadCheckState - 2019/12/30 18:57:18.066889 [INFO] consul: cluster leadership acquired
TestAgent_loadCheckState - 2019/12/30 18:57:18.067343 [INFO] consul: New leader elected: Node 6cc5d5d3-4c14-ec65-09d3-dd18abfdfe1c
TestAgent_Join_ACLDeny - 2019/12/30 18:57:18.254348 [INFO] consul: Created ACL 'global-management' policy
TestAgent_Join_ACLDeny - 2019/12/30 18:57:18.254516 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgent_loadCheckState - 2019/12/30 18:57:18.274305 [DEBUG] agent: check state expired for "check1", not restoring
TestAgent_loadCheckState - 2019/12/30 18:57:18.276418 [INFO] agent: Requesting shutdown
TestAgent_loadCheckState - 2019/12/30 18:57:18.276538 [INFO] consul: shutting down server
TestAgent_loadCheckState - 2019/12/30 18:57:18.276602 [WARN] serf: Shutdown without a Leave
TestAgent_Join_ACLDeny - 2019/12/30 18:57:18.376408 [INFO] acl: initializing acls
TestAgent_Join_ACLDeny - 2019/12/30 18:57:18.376659 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgent_loadCheckState - 2019/12/30 18:57:18.536201 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1a812a8f-495b-85cb-7817-0c357e1ddf4c Address:127.0.0.1:18856}]
2019/12/30 18:57:18 [INFO]  raft: Node at 127.0.0.1:18856 [Follower] entering Follower state (Leader: "")
TestAgent_Join_ACLDeny - 2019/12/30 18:57:18.540948 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgent_persistCheckState - 2019/12/30 18:57:18.548109 [INFO] serf: EventMemberJoin: Node 1a812a8f-495b-85cb-7817-0c357e1ddf4c.dc1 127.0.0.1
TestAgent_persistCheckState - 2019/12/30 18:57:18.558883 [INFO] serf: EventMemberJoin: Node 1a812a8f-495b-85cb-7817-0c357e1ddf4c 127.0.0.1
TestAgent_persistCheckState - 2019/12/30 18:57:18.563125 [INFO] consul: Adding LAN server Node 1a812a8f-495b-85cb-7817-0c357e1ddf4c (Addr: tcp/127.0.0.1:18856) (DC: dc1)
TestAgent_persistCheckState - 2019/12/30 18:57:18.564220 [INFO] consul: Handled member-join event for server "Node 1a812a8f-495b-85cb-7817-0c357e1ddf4c.dc1" in area "wan"
TestAgent_persistCheckState - 2019/12/30 18:57:18.566260 [INFO] agent: Started DNS server 127.0.0.1:18851 (tcp)
TestAgent_persistCheckState - 2019/12/30 18:57:18.567471 [INFO] agent: Started DNS server 127.0.0.1:18851 (udp)
TestAgent_persistCheckState - 2019/12/30 18:57:18.570329 [INFO] agent: Started HTTP server on 127.0.0.1:18852 (tcp)
TestAgent_persistCheckState - 2019/12/30 18:57:18.570463 [INFO] agent: started state syncer
2019/12/30 18:57:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:18 [INFO]  raft: Node at 127.0.0.1:18856 [Candidate] entering Candidate state in term 2
TestAgent_loadCheckState - 2019/12/30 18:57:18.610995 [INFO] manager: shutting down
TestAgent_loadCheckState - 2019/12/30 18:57:18.617362 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestAgent_loadCheckState - 2019/12/30 18:57:18.617591 [INFO] agent: consul server down
TestAgent_loadCheckState - 2019/12/30 18:57:18.617638 [INFO] agent: shutdown complete
TestAgent_loadCheckState - 2019/12/30 18:57:18.617689 [INFO] agent: Stopping DNS server 127.0.0.1:18845 (tcp)
TestAgent_loadCheckState - 2019/12/30 18:57:18.617860 [INFO] agent: Stopping DNS server 127.0.0.1:18845 (udp)
TestAgent_loadCheckState - 2019/12/30 18:57:18.618042 [INFO] agent: Stopping HTTP server 127.0.0.1:18846 (tcp)
TestAgent_loadCheckState - 2019/12/30 18:57:18.618296 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadCheckState - 2019/12/30 18:57:18.618377 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadCheckState (2.65s)
=== CONT  TestAgent_checkStateSnapshot
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_checkStateSnapshot - 2019/12/30 18:57:18.736072 [WARN] agent: Node name "Node 01cc8dfc-10ae-a1cc-7dc9-645cfc00c620" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_checkStateSnapshot - 2019/12/30 18:57:18.736998 [DEBUG] tlsutil: Update with version 1
TestAgent_checkStateSnapshot - 2019/12/30 18:57:18.743414 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:18.829562 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.035816 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.037710 [INFO] serf: EventMemberUpdate: Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.039831 [INFO] serf: EventMemberUpdate: Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce.dc1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.097777 [WARN] agent: Node name "Node b9be231d-50db-828b-0155-0ab7a72a9223" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.098297 [DEBUG] tlsutil: Update with version 1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.100698 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:19 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:19 [INFO]  raft: Node at 127.0.0.1:18856 [Leader] entering Leader state
2019/12/30 18:57:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9c829afb-88aa-18cb-0a51-6e24b73537c1 Address:127.0.0.1:18862}]
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.314055 [INFO] agent: Synced node info
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.314269 [DEBUG] agent: Node info in sync
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.314529 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.314583 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.315375 [INFO] serf: EventMemberUpdate: Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce
TestAgent_Join_ACLDeny - 2019/12/30 18:57:19.316023 [INFO] serf: EventMemberUpdate: Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce.dc1
TestAgent_persistCheckState - 2019/12/30 18:57:19.317202 [INFO] consul: cluster leadership acquired
TestAgent_persistCheckState - 2019/12/30 18:57:19.317620 [INFO] consul: New leader elected: Node 1a812a8f-495b-85cb-7817-0c357e1ddf4c
2019/12/30 18:57:19 [INFO]  raft: Node at 127.0.0.1:18862 [Follower] entering Follower state (Leader: "")
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.320433 [INFO] serf: EventMemberJoin: Node 9c829afb-88aa-18cb-0a51-6e24b73537c1.dc1 127.0.0.1
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.330242 [INFO] serf: EventMemberJoin: Node 9c829afb-88aa-18cb-0a51-6e24b73537c1 127.0.0.1
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.332561 [INFO] consul: Adding LAN server Node 9c829afb-88aa-18cb-0a51-6e24b73537c1 (Addr: tcp/127.0.0.1:18862) (DC: dc1)
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.333547 [INFO] consul: Handled member-join event for server "Node 9c829afb-88aa-18cb-0a51-6e24b73537c1.dc1" in area "wan"
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.339350 [INFO] agent: Started DNS server 127.0.0.1:18857 (tcp)
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.340224 [INFO] agent: Started DNS server 127.0.0.1:18857 (udp)
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.343141 [INFO] agent: Started HTTP server on 127.0.0.1:18858 (tcp)
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:19.343241 [INFO] agent: started state syncer
2019/12/30 18:57:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:19 [INFO]  raft: Node at 127.0.0.1:18862 [Candidate] entering Candidate state in term 2
TestAgent_persistCheckState - 2019/12/30 18:57:19.408382 [INFO] agent: Requesting shutdown
TestAgent_persistCheckState - 2019/12/30 18:57:19.408470 [INFO] consul: shutting down server
TestAgent_persistCheckState - 2019/12/30 18:57:19.408543 [WARN] serf: Shutdown without a Leave
TestAgent_persistCheckState - 2019/12/30 18:57:19.408897 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_persistCheckState - 2019/12/30 18:57:19.510828 [WARN] serf: Shutdown without a Leave
jones - 2019/12/30 18:57:19.560560 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:19.560649 [DEBUG] agent: Node info in sync
TestAgent_persistCheckState - 2019/12/30 18:57:19.620261 [INFO] manager: shutting down
TestAgent_persistCheckState - 2019/12/30 18:57:19.712796 [INFO] agent: consul server down
TestAgent_persistCheckState - 2019/12/30 18:57:19.712904 [INFO] agent: shutdown complete
TestAgent_persistCheckState - 2019/12/30 18:57:19.713002 [INFO] agent: Stopping DNS server 127.0.0.1:18851 (tcp)
TestAgent_persistCheckState - 2019/12/30 18:57:19.713167 [INFO] agent: Stopping DNS server 127.0.0.1:18851 (udp)
TestAgent_persistCheckState - 2019/12/30 18:57:19.713336 [INFO] agent: Stopping HTTP server 127.0.0.1:18852 (tcp)
TestAgent_persistCheckState - 2019/12/30 18:57:19.713618 [INFO] agent: Waiting for endpoints to shut down
TestAgent_persistCheckState - 2019/12/30 18:57:19.713696 [INFO] agent: Endpoints down
--- PASS: TestAgent_persistCheckState (2.74s)
=== CONT  TestAgent_NodeMaintenanceMode
TestAgent_persistCheckState - 2019/12/30 18:57:19.719793 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:19.806330 [WARN] agent: Node name "Node 54813ffd-b04d-2641-9da8-5aacd71a7af0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:19.807050 [DEBUG] tlsutil: Update with version 1
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:19.811215 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:01cc8dfc-10ae-a1cc-7dc9-645cfc00c620 Address:127.0.0.1:18868}]
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.972148 [INFO] serf: EventMemberJoin: Node 01cc8dfc-10ae-a1cc-7dc9-645cfc00c620.dc1 127.0.0.1
2019/12/30 18:57:19 [INFO]  raft: Node at 127.0.0.1:18868 [Follower] entering Follower state (Leader: "")
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.976895 [INFO] serf: EventMemberJoin: Node 01cc8dfc-10ae-a1cc-7dc9-645cfc00c620 127.0.0.1
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.978647 [INFO] consul: Adding LAN server Node 01cc8dfc-10ae-a1cc-7dc9-645cfc00c620 (Addr: tcp/127.0.0.1:18868) (DC: dc1)
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.978731 [INFO] agent: Started DNS server 127.0.0.1:18863 (udp)
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.978926 [INFO] consul: Handled member-join event for server "Node 01cc8dfc-10ae-a1cc-7dc9-645cfc00c620.dc1" in area "wan"
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.979607 [INFO] agent: Started DNS server 127.0.0.1:18863 (tcp)
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.982198 [INFO] agent: Started HTTP server on 127.0.0.1:18864 (tcp)
TestAgent_checkStateSnapshot - 2019/12/30 18:57:19.982306 [INFO] agent: started state syncer
2019/12/30 18:57:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:20 [INFO]  raft: Node at 127.0.0.1:18868 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:20 [INFO]  raft: Node at 127.0.0.1:18862 [Leader] entering Leader state
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.096799 [INFO] consul: cluster leadership acquired
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.097283 [INFO] consul: New leader elected: Node 9c829afb-88aa-18cb-0a51-6e24b73537c1
2019/12/30 18:57:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b9be231d-50db-828b-0155-0ab7a72a9223 Address:127.0.0.1:18874}]
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.408593 [INFO] serf: EventMemberJoin: Node b9be231d-50db-828b-0155-0ab7a72a9223.dc1 127.0.0.1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.417337 [INFO] serf: EventMemberJoin: Node b9be231d-50db-828b-0155-0ab7a72a9223 127.0.0.1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.420622 [INFO] agent: Started DNS server 127.0.0.1:18869 (udp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.422651 [INFO] agent: Started DNS server 127.0.0.1:18869 (tcp)
2019/12/30 18:57:20 [INFO]  raft: Node at 127.0.0.1:18874 [Follower] entering Follower state (Leader: "")
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.421104 [INFO] consul: Handled member-join event for server "Node b9be231d-50db-828b-0155-0ab7a72a9223.dc1" in area "wan"
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.422485 [INFO] consul: Adding LAN server Node b9be231d-50db-828b-0155-0ab7a72a9223 (Addr: tcp/127.0.0.1:18874) (DC: dc1)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.429704 [INFO] agent: Started HTTP server on 127.0.0.1:18870 (tcp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.430335 [INFO] agent: started state syncer
2019/12/30 18:57:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:20 [INFO]  raft: Node at 127.0.0.1:18874 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:20 [ERR] consul.watch: Watch (type: key) errored: Get https://127.0.0.1:18829/v1/kv/asdf: dial tcp 127.0.0.1:18829: connect: connection refused, retry in 20s
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.586611 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.587159 [DEBUG] consul: Skipping self join check for "Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce" since the cluster is too small
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.587320 [INFO] consul: member 'Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce' joined, marking health alive
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.592786 [INFO] agent: Synced node info
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.595353 [DEBUG] agent: Node info in sync
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.768742 [WARN] agent: Failed to restore check "service:redis": ServiceID "nope" does not exist
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.770314 [DEBUG] agent: restored health check "service:redis" from "/tmp/TestAgent_loadChecks_checkFails-agent959715518/checks/60a2ef12de014a05ecdc850d9aab46da"
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.770587 [INFO] agent: Requesting shutdown
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.770770 [INFO] consul: shutting down server
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.771092 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:20 [INFO]  raft: Node at 127.0.0.1:18868 [Leader] entering Leader state
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.871852 [DEBUG] consul: Skipping self join check for "Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce" since the cluster is too small
TestAgent_Join_ACLDeny - 2019/12/30 18:57:20.872383 [DEBUG] consul: Skipping self join check for "Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce" since the cluster is too small
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.873603 [WARN] serf: Shutdown without a Leave
TestAgent_checkStateSnapshot - 2019/12/30 18:57:20.874066 [INFO] consul: cluster leadership acquired
TestAgent_checkStateSnapshot - 2019/12/30 18:57:20.874689 [INFO] consul: New leader elected: Node 01cc8dfc-10ae-a1cc-7dc9-645cfc00c620
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:20.987965 [INFO] manager: shutting down
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.077968 [INFO] agent: consul server down
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.078062 [INFO] agent: shutdown complete
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.078155 [INFO] agent: Stopping DNS server 127.0.0.1:18857 (tcp)
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.078336 [INFO] agent: Stopping DNS server 127.0.0.1:18857 (udp)
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.078539 [INFO] agent: Stopping HTTP server 127.0.0.1:18858 (tcp)
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.078795 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.078887 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadChecks_checkFails (3.12s)
=== CONT  TestAgent_AddCheck_restoresSnapshot
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.088470 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_loadChecks_checkFails - 2019/12/30 18:57:21.088762 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:21.156028 [WARN] agent: Node name "Node 81d17a14-8e0e-bf8d-1c99-2971c2fc2963" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:21.156849 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:21.159145 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:21 [INFO]  raft: Node at 127.0.0.1:18874 [Leader] entering Leader state
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.180404 [INFO] consul: cluster leadership acquired
2019/12/30 18:57:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:54813ffd-b04d-2641-9da8-5aacd71a7af0 Address:127.0.0.1:18880}]
2019/12/30 18:57:21 [INFO]  raft: Node at 127.0.0.1:18880 [Follower] entering Follower state (Leader: "")
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.180865 [INFO] consul: New leader elected: Node b9be231d-50db-828b-0155-0ab7a72a9223
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.185036 [INFO] serf: EventMemberJoin: Node 54813ffd-b04d-2641-9da8-5aacd71a7af0.dc1 127.0.0.1
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.192705 [INFO] serf: EventMemberJoin: Node 54813ffd-b04d-2641-9da8-5aacd71a7af0 127.0.0.1
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.193823 [INFO] consul: Adding LAN server Node 54813ffd-b04d-2641-9da8-5aacd71a7af0 (Addr: tcp/127.0.0.1:18880) (DC: dc1)
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.194973 [INFO] consul: Handled member-join event for server "Node 54813ffd-b04d-2641-9da8-5aacd71a7af0.dc1" in area "wan"
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.195195 [INFO] agent: Started DNS server 127.0.0.1:18875 (tcp)
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.195975 [INFO] agent: Started DNS server 127.0.0.1:18875 (udp)
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.200223 [INFO] agent: Started HTTP server on 127.0.0.1:18876 (tcp)
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.200343 [INFO] agent: started state syncer
2019/12/30 18:57:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:21 [INFO]  raft: Node at 127.0.0.1:18880 [Candidate] entering Candidate state in term 2
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.350325 [DEBUG] consul: dropping node "Node d3e9f8b7-2512-8da9-bed8-314bbdcd41ce" from result due to ACLs
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.357635 [INFO] agent: Synced node info
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.459537 [DEBUG] agent: removed check "service:redis"
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.474586 [DEBUG] agent: restored health check "service:redis" from "/tmp/TestAgent_checkStateSnapshot-agent724462853/checks/60a2ef12de014a05ecdc850d9aab46da"
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.475124 [INFO] agent: Requesting shutdown
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.475312 [INFO] consul: shutting down server
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.475492 [WARN] serf: Shutdown without a Leave
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.531624 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.630375 [WARN] serf: Shutdown without a Leave
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.633495 [INFO] agent: Synced node info
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.654322 [INFO] agent: Requesting shutdown
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.654674 [INFO] consul: shutting down server
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.654810 [WARN] serf: Shutdown without a Leave
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.730063 [INFO] manager: shutting down
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.730867 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.731745 [INFO] agent: consul server down
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.732418 [INFO] agent: shutdown complete
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.732558 [INFO] agent: Stopping DNS server 127.0.0.1:18863 (tcp)
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.732810 [INFO] agent: Stopping DNS server 127.0.0.1:18863 (udp)
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.733054 [INFO] agent: Stopping HTTP server 127.0.0.1:18864 (tcp)
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.733351 [INFO] agent: Waiting for endpoints to shut down
TestAgent_checkStateSnapshot - 2019/12/30 18:57:21.734505 [INFO] agent: Endpoints down
--- PASS: TestAgent_checkStateSnapshot (3.12s)
=== CONT  TestAgent_AddService_restoresSnapshot
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:21.798527 [WARN] agent: Node name "Node c82e4a68-a03c-7bbb-e585-04ceaa19f71c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:21.799186 [DEBUG] tlsutil: Update with version 1
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:21.802008 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.852553 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:21 [INFO]  raft: Node at 127.0.0.1:18880 [Leader] entering Leader state
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.953640 [INFO] consul: cluster leadership acquired
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:21.954251 [INFO] consul: New leader elected: Node 54813ffd-b04d-2641-9da8-5aacd71a7af0
TestAgent_Join_ACLDeny - 2019/12/30 18:57:21.955081 [INFO] manager: shutting down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.054520 [INFO] agent: consul server down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.054602 [INFO] agent: shutdown complete
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.054668 [INFO] agent: Stopping DNS server 127.0.0.1:18869 (tcp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.054857 [INFO] agent: Stopping DNS server 127.0.0.1:18869 (udp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.055056 [INFO] agent: Stopping HTTP server 127.0.0.1:18870 (tcp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.055329 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.055421 [INFO] agent: Endpoints down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.055466 [INFO] agent: Requesting shutdown
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.055523 [INFO] consul: shutting down server
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.055570 [WARN] serf: Shutdown without a Leave
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.055865 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.062089 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.152595 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:81d17a14-8e0e-bf8d-1c99-2971c2fc2963 Address:127.0.0.1:18886}]
2019/12/30 18:57:22 [INFO]  raft: Node at 127.0.0.1:18886 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.257707 [INFO] serf: EventMemberJoin: Node 81d17a14-8e0e-bf8d-1c99-2971c2fc2963.dc1 127.0.0.1
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.258021 [INFO] manager: shutting down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.258883 [INFO] agent: consul server down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.258942 [INFO] agent: shutdown complete
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.259000 [INFO] agent: Stopping DNS server 127.0.0.1:18839 (tcp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.259185 [INFO] agent: Stopping DNS server 127.0.0.1:18839 (udp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.259356 [INFO] agent: Stopping HTTP server 127.0.0.1:18840 (tcp)
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.259642 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Join_ACLDeny - 2019/12/30 18:57:22.259719 [INFO] agent: Endpoints down
--- PASS: TestAgent_Join_ACLDeny (6.62s)
=== CONT  TestAgent_Service_MaintenanceMode
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.275446 [INFO] serf: EventMemberJoin: Node 81d17a14-8e0e-bf8d-1c99-2971c2fc2963 127.0.0.1
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.283541 [INFO] consul: Adding LAN server Node 81d17a14-8e0e-bf8d-1c99-2971c2fc2963 (Addr: tcp/127.0.0.1:18886) (DC: dc1)
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.283664 [INFO] consul: Handled member-join event for server "Node 81d17a14-8e0e-bf8d-1c99-2971c2fc2963.dc1" in area "wan"
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.285157 [INFO] agent: Started DNS server 127.0.0.1:18881 (tcp)
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.285460 [INFO] agent: Started DNS server 127.0.0.1:18881 (udp)
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.288867 [INFO] agent: Started HTTP server on 127.0.0.1:18882 (tcp)
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.289012 [INFO] agent: started state syncer
2019/12/30 18:57:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:22 [INFO]  raft: Node at 127.0.0.1:18886 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:22.331694 [WARN] agent: Node name "Node c9e3f2d1-101f-b327-7a93-d3abb386e2f3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:22.332346 [DEBUG] tlsutil: Update with version 1
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:22.334838 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.353471 [INFO] agent: Synced node info
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.448851 [INFO] agent: Node entered maintenance mode
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.451436 [DEBUG] agent: removed check "_node_maintenance"
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.451783 [INFO] agent: Node left maintenance mode
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.542780 [INFO] agent: Node entered maintenance mode
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.543013 [INFO] agent: Requesting shutdown
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.543147 [INFO] consul: shutting down server
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.543323 [WARN] serf: Shutdown without a Leave
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.688127 [WARN] serf: Shutdown without a Leave
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.777643 [INFO] manager: shutting down
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.779847 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.779941 [INFO] agent: consul server down
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.779987 [INFO] agent: shutdown complete
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.780038 [INFO] agent: Stopping DNS server 127.0.0.1:18875 (tcp)
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.780172 [INFO] agent: Stopping DNS server 127.0.0.1:18875 (udp)
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.780340 [INFO] agent: Stopping HTTP server 127.0.0.1:18876 (tcp)
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.780559 [INFO] agent: Waiting for endpoints to shut down
TestAgent_NodeMaintenanceMode - 2019/12/30 18:57:22.780628 [INFO] agent: Endpoints down
--- PASS: TestAgent_NodeMaintenanceMode (3.07s)
=== CONT  TestAgent_unloadProxies
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_unloadProxies - 2019/12/30 18:57:22.853339 [WARN] agent: Node name "Node 3edb0bb9-8d4e-2f91-9cdd-0a1d75100374" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_unloadProxies - 2019/12/30 18:57:22.854045 [DEBUG] tlsutil: Update with version 1
TestAgent_unloadProxies - 2019/12/30 18:57:22.856827 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c82e4a68-a03c-7bbb-e585-04ceaa19f71c Address:127.0.0.1:18892}]
2019/12/30 18:57:22 [INFO]  raft: Node at 127.0.0.1:18892 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:22 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:22 [INFO]  raft: Node at 127.0.0.1:18886 [Leader] entering Leader state
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.948125 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:22.948613 [INFO] consul: New leader elected: Node 81d17a14-8e0e-bf8d-1c99-2971c2fc2963
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.951355 [INFO] serf: EventMemberJoin: Node c82e4a68-a03c-7bbb-e585-04ceaa19f71c.dc1 127.0.0.1
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.971090 [INFO] serf: EventMemberJoin: Node c82e4a68-a03c-7bbb-e585-04ceaa19f71c 127.0.0.1
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.972584 [INFO] consul: Adding LAN server Node c82e4a68-a03c-7bbb-e585-04ceaa19f71c (Addr: tcp/127.0.0.1:18892) (DC: dc1)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.973245 [INFO] consul: Handled member-join event for server "Node c82e4a68-a03c-7bbb-e585-04ceaa19f71c.dc1" in area "wan"
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.975962 [INFO] agent: Started DNS server 127.0.0.1:18887 (tcp)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.976084 [INFO] agent: Started DNS server 127.0.0.1:18887 (udp)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.981344 [INFO] agent: Started HTTP server on 127.0.0.1:18888 (tcp)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:22.981471 [INFO] agent: started state syncer
2019/12/30 18:57:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:23 [INFO]  raft: Node at 127.0.0.1:18892 [Candidate] entering Candidate state in term 2
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.412031 [INFO] agent: Synced node info
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.412576 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.412665 [INFO] consul: shutting down server
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.412716 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c9e3f2d1-101f-b327-7a93-d3abb386e2f3 Address:127.0.0.1:18898}]
2019/12/30 18:57:23 [INFO]  raft: Node at 127.0.0.1:18898 [Follower] entering Follower state (Leader: "")
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.523594 [INFO] serf: EventMemberJoin: Node c9e3f2d1-101f-b327-7a93-d3abb386e2f3.dc1 127.0.0.1
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.528096 [WARN] serf: Shutdown without a Leave
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.550799 [INFO] serf: EventMemberJoin: Node c9e3f2d1-101f-b327-7a93-d3abb386e2f3 127.0.0.1
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.551619 [INFO] consul: Handled member-join event for server "Node c9e3f2d1-101f-b327-7a93-d3abb386e2f3.dc1" in area "wan"
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.551930 [INFO] consul: Adding LAN server Node c9e3f2d1-101f-b327-7a93-d3abb386e2f3 (Addr: tcp/127.0.0.1:18898) (DC: dc1)
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.552482 [INFO] agent: Started DNS server 127.0.0.1:18893 (tcp)
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.557530 [INFO] agent: Started DNS server 127.0.0.1:18893 (udp)
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.560205 [INFO] agent: Started HTTP server on 127.0.0.1:18894 (tcp)
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:23.560337 [INFO] agent: started state syncer
2019/12/30 18:57:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:23 [INFO]  raft: Node at 127.0.0.1:18898 [Candidate] entering Candidate state in term 2
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.703968 [INFO] manager: shutting down
2019/12/30 18:57:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:23 [INFO]  raft: Node at 127.0.0.1:18892 [Leader] entering Leader state
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:23.813487 [INFO] consul: cluster leadership acquired
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:23.813878 [INFO] consul: New leader elected: Node c82e4a68-a03c-7bbb-e585-04ceaa19f71c
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.814632 [INFO] agent: consul server down
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.814693 [INFO] agent: shutdown complete
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.814762 [INFO] agent: Stopping DNS server 127.0.0.1:18881 (tcp)
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.814899 [INFO] agent: Stopping DNS server 127.0.0.1:18881 (udp)
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.815111 [INFO] agent: Stopping HTTP server 127.0.0.1:18882 (tcp)
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.815361 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.815449 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_restoresSnapshot (2.74s)
=== CONT  TestAgent_loadProxies_nilProxy
TestAgent_AddCheck_restoresSnapshot - 2019/12/30 18:57:23.819551 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:23.952670 [WARN] agent: Node name "Node fc634ec6-b820-c0dc-e02e-da568d2453a2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:23.953260 [DEBUG] tlsutil: Update with version 1
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:23.955742 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.050839 [INFO] agent: Requesting shutdown
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.050926 [INFO] consul: shutting down server
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.050977 [WARN] serf: Shutdown without a Leave
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.051340 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:57:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3edb0bb9-8d4e-2f91-9cdd-0a1d75100374 Address:127.0.0.1:18904}]
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.146758 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:24 [INFO]  raft: Node at 127.0.0.1:18904 [Follower] entering Follower state (Leader: "")
TestAgent_unloadProxies - 2019/12/30 18:57:24.162430 [INFO] serf: EventMemberJoin: Node 3edb0bb9-8d4e-2f91-9cdd-0a1d75100374.dc1 127.0.0.1
TestAgent_unloadProxies - 2019/12/30 18:57:24.167847 [INFO] serf: EventMemberJoin: Node 3edb0bb9-8d4e-2f91-9cdd-0a1d75100374 127.0.0.1
TestAgent_unloadProxies - 2019/12/30 18:57:24.168724 [INFO] consul: Adding LAN server Node 3edb0bb9-8d4e-2f91-9cdd-0a1d75100374 (Addr: tcp/127.0.0.1:18904) (DC: dc1)
TestAgent_unloadProxies - 2019/12/30 18:57:24.169256 [INFO] consul: Handled member-join event for server "Node 3edb0bb9-8d4e-2f91-9cdd-0a1d75100374.dc1" in area "wan"
2019/12/30 18:57:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:24 [INFO]  raft: Node at 127.0.0.1:18904 [Candidate] entering Candidate state in term 2
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.264088 [INFO] manager: shutting down
2019/12/30 18:57:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:24 [INFO]  raft: Node at 127.0.0.1:18898 [Leader] entering Leader state
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:24.445383 [INFO] consul: cluster leadership acquired
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:24.445886 [INFO] consul: New leader elected: Node c9e3f2d1-101f-b327-7a93-d3abb386e2f3
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.552881 [INFO] agent: consul server down
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.552981 [INFO] agent: shutdown complete
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.553054 [INFO] agent: Stopping DNS server 127.0.0.1:18887 (tcp)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.553254 [INFO] agent: Stopping DNS server 127.0.0.1:18887 (udp)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.553436 [INFO] agent: Stopping HTTP server 127.0.0.1:18888 (tcp)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.553683 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.553760 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddService_restoresSnapshot (2.82s)
=== CONT  TestAgent_loadProxies
TestAgent_unloadProxies - 2019/12/30 18:57:24.558028 [DEBUG] agent: check "service:rabbitmq-proxy" exists, not restoring from "/tmp/TestAgent_unloadProxies-agent621403427/checks/ad03f99eaa314ba8c60a056e4332f189"
TestAgent_unloadProxies - 2019/12/30 18:57:24.559595 [INFO] agent: Started DNS server 127.0.0.1:18899 (udp)
TestAgent_unloadProxies - 2019/12/30 18:57:24.559999 [INFO] agent: Started DNS server 127.0.0.1:18899 (tcp)
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.560086 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.560328 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.560501 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.560556 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.560603 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestAgent_AddService_restoresSnapshot - 2019/12/30 18:57:24.560651 [ERR] consul: failed to transfer leadership in 3 attempts
TestAgent_unloadProxies - 2019/12/30 18:57:24.562559 [INFO] agent: Started HTTP server on 127.0.0.1:18900 (tcp)
TestAgent_unloadProxies - 2019/12/30 18:57:24.562673 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadProxies - 2019/12/30 18:57:24.698562 [WARN] agent: Node name "Node 64ac38ad-806b-f026-065a-588e078c46cf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadProxies - 2019/12/30 18:57:24.699113 [DEBUG] tlsutil: Update with version 1
TestAgent_loadProxies - 2019/12/30 18:57:24.702128 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:24.903642 [INFO] agent: Synced node info
2019/12/30 18:57:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:24 [INFO]  raft: Node at 127.0.0.1:18904 [Leader] entering Leader state
TestAgent_unloadProxies - 2019/12/30 18:57:24.906389 [INFO] consul: cluster leadership acquired
TestAgent_unloadProxies - 2019/12/30 18:57:24.906874 [INFO] consul: New leader elected: Node 3edb0bb9-8d4e-2f91-9cdd-0a1d75100374
TestAgent_unloadProxies - 2019/12/30 18:57:24.941380 [DEBUG] agent: removed check "service:rabbitmq-proxy"
TestAgent_unloadProxies - 2019/12/30 18:57:24.941468 [DEBUG] agent: removed service "rabbitmq-proxy"
TestAgent_unloadProxies - 2019/12/30 18:57:24.941516 [INFO] agent: Requesting shutdown
TestAgent_unloadProxies - 2019/12/30 18:57:24.941571 [INFO] consul: shutting down server
TestAgent_unloadProxies - 2019/12/30 18:57:24.941630 [WARN] serf: Shutdown without a Leave
TestAgent_unloadProxies - 2019/12/30 18:57:24.941829 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.022289 [INFO] agent: Service "redis" entered maintenance mode
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.023537 [DEBUG] agent: removed check "_service_maintenance:redis"
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.023635 [INFO] agent: Service "redis" left maintenance mode
TestAgent_unloadProxies - 2019/12/30 18:57:25.135963 [WARN] serf: Shutdown without a Leave
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.140041 [INFO] agent: Service "redis" entered maintenance mode
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.140184 [INFO] agent: Requesting shutdown
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.140260 [INFO] consul: shutting down server
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.140324 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fc634ec6-b820-c0dc-e02e-da568d2453a2 Address:127.0.0.1:18910}]
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.146983 [INFO] serf: EventMemberJoin: Node fc634ec6-b820-c0dc-e02e-da568d2453a2.dc1 127.0.0.1
2019/12/30 18:57:25 [INFO]  raft: Node at 127.0.0.1:18910 [Follower] entering Follower state (Leader: "")
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.152932 [INFO] serf: EventMemberJoin: Node fc634ec6-b820-c0dc-e02e-da568d2453a2 127.0.0.1
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.154538 [INFO] consul: Adding LAN server Node fc634ec6-b820-c0dc-e02e-da568d2453a2 (Addr: tcp/127.0.0.1:18910) (DC: dc1)
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.157202 [INFO] consul: Handled member-join event for server "Node fc634ec6-b820-c0dc-e02e-da568d2453a2.dc1" in area "wan"
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.160846 [INFO] agent: Started DNS server 127.0.0.1:18905 (tcp)
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.161632 [INFO] agent: Started DNS server 127.0.0.1:18905 (udp)
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.164343 [INFO] agent: Started HTTP server on 127.0.0.1:18906 (tcp)
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:25.187906 [INFO] agent: started state syncer
2019/12/30 18:57:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:25 [INFO]  raft: Node at 127.0.0.1:18910 [Candidate] entering Candidate state in term 2
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.253862 [WARN] serf: Shutdown without a Leave
TestAgent_unloadProxies - 2019/12/30 18:57:25.263663 [INFO] manager: shutting down
TestAgent_unloadProxies - 2019/12/30 18:57:25.352029 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_unloadProxies - 2019/12/30 18:57:25.352485 [INFO] agent: consul server down
TestAgent_unloadProxies - 2019/12/30 18:57:25.352538 [INFO] agent: shutdown complete
TestAgent_unloadProxies - 2019/12/30 18:57:25.352593 [INFO] agent: Stopping DNS server 127.0.0.1:18899 (tcp)
TestAgent_unloadProxies - 2019/12/30 18:57:25.352736 [INFO] agent: Stopping DNS server 127.0.0.1:18899 (udp)
TestAgent_unloadProxies - 2019/12/30 18:57:25.352896 [INFO] agent: Stopping HTTP server 127.0.0.1:18900 (tcp)
TestAgent_unloadProxies - 2019/12/30 18:57:25.353104 [INFO] agent: Waiting for endpoints to shut down
TestAgent_unloadProxies - 2019/12/30 18:57:25.353180 [INFO] agent: Endpoints down
--- PASS: TestAgent_unloadProxies (2.57s)
=== CONT  TestAgent_unloadServices
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.354251 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_unloadServices - 2019/12/30 18:57:25.470627 [WARN] agent: Node name "Node 11f054ad-81ed-4603-37ee-987d1c01aa42" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_unloadServices - 2019/12/30 18:57:25.471120 [DEBUG] tlsutil: Update with version 1
TestAgent_unloadServices - 2019/12/30 18:57:25.473810 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.894678 [INFO] agent: consul server down
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.894765 [INFO] agent: shutdown complete
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.894830 [INFO] agent: Stopping DNS server 127.0.0.1:18893 (tcp)
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.894999 [INFO] agent: Stopping DNS server 127.0.0.1:18893 (udp)
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.895178 [INFO] agent: Stopping HTTP server 127.0.0.1:18894 (tcp)
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.895438 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.895526 [INFO] agent: Endpoints down
--- PASS: TestAgent_Service_MaintenanceMode (3.64s)
=== CONT  TestAgent_loadServices_sidecarOverrideMeta
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.895767 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.895973 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.896036 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.896086 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestAgent_Service_MaintenanceMode - 2019/12/30 18:57:25.896132 [ERR] consul: failed to transfer leadership in 3 attempts
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:25.963540 [WARN] agent: Node name "Node c06e1a84-48b5-2378-eddd-6bfca7ee9010" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:25.964286 [DEBUG] tlsutil: Update with version 1
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:25.966977 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:64ac38ad-806b-f026-065a-588e078c46cf Address:127.0.0.1:18916}]
TestAgent_loadProxies - 2019/12/30 18:57:26.048135 [INFO] serf: EventMemberJoin: Node 64ac38ad-806b-f026-065a-588e078c46cf.dc1 127.0.0.1
2019/12/30 18:57:26 [INFO]  raft: Node at 127.0.0.1:18916 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:26 [INFO]  raft: Node at 127.0.0.1:18910 [Leader] entering Leader state
TestAgent_loadProxies - 2019/12/30 18:57:26.052250 [INFO] serf: EventMemberJoin: Node 64ac38ad-806b-f026-065a-588e078c46cf 127.0.0.1
TestAgent_loadProxies - 2019/12/30 18:57:26.054056 [INFO] consul: Adding LAN server Node 64ac38ad-806b-f026-065a-588e078c46cf (Addr: tcp/127.0.0.1:18916) (DC: dc1)
TestAgent_loadProxies - 2019/12/30 18:57:26.054306 [INFO] consul: Handled member-join event for server "Node 64ac38ad-806b-f026-065a-588e078c46cf.dc1" in area "wan"
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.054599 [INFO] consul: cluster leadership acquired
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.055043 [INFO] consul: New leader elected: Node fc634ec6-b820-c0dc-e02e-da568d2453a2
2019/12/30 18:57:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:26 [INFO]  raft: Node at 127.0.0.1:18916 [Candidate] entering Candidate state in term 2
TestAgent_loadProxies - 2019/12/30 18:57:26.423878 [DEBUG] agent: check "service:rabbitmq-proxy" exists, not restoring from "/tmp/TestAgent_loadProxies-agent511864141/checks/ad03f99eaa314ba8c60a056e4332f189"
TestAgent_loadProxies - 2019/12/30 18:57:26.425659 [INFO] agent: Started DNS server 127.0.0.1:18911 (udp)
TestAgent_loadProxies - 2019/12/30 18:57:26.426551 [INFO] agent: Started DNS server 127.0.0.1:18911 (tcp)
TestAgent_loadProxies - 2019/12/30 18:57:26.429273 [INFO] agent: Started HTTP server on 127.0.0.1:18912 (tcp)
TestAgent_loadProxies - 2019/12/30 18:57:26.435326 [INFO] agent: started state syncer
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.521239 [INFO] agent: Synced service "rabbitmq"
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.521316 [DEBUG] agent: Node info in sync
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.521691 [INFO] agent: Requesting shutdown
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.521763 [INFO] consul: shutting down server
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.521806 [WARN] serf: Shutdown without a Leave
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.629832 [WARN] serf: Shutdown without a Leave
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.706036 [DEBUG] agent: Service "rabbitmq" in sync
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.706134 [DEBUG] agent: Node info in sync
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.706269 [DEBUG] agent: Service "rabbitmq" in sync
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.706315 [DEBUG] agent: Node info in sync
2019/12/30 18:57:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:11f054ad-81ed-4603-37ee-987d1c01aa42 Address:127.0.0.1:18922}]
2019/12/30 18:57:26 [INFO]  raft: Node at 127.0.0.1:18922 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:26 [INFO]  raft: Node at 127.0.0.1:18916 [Leader] entering Leader state
TestAgent_loadProxies - 2019/12/30 18:57:26.727046 [INFO] consul: cluster leadership acquired
TestAgent_loadProxies - 2019/12/30 18:57:26.727525 [INFO] consul: New leader elected: Node 64ac38ad-806b-f026-065a-588e078c46cf
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:26.740759 [INFO] manager: shutting down
TestAgent_unloadServices - 2019/12/30 18:57:26.759814 [INFO] serf: EventMemberJoin: Node 11f054ad-81ed-4603-37ee-987d1c01aa42.dc1 127.0.0.1
2019/12/30 18:57:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:26 [INFO]  raft: Node at 127.0.0.1:18922 [Candidate] entering Candidate state in term 2
TestAgent_unloadServices - 2019/12/30 18:57:26.780167 [INFO] serf: EventMemberJoin: Node 11f054ad-81ed-4603-37ee-987d1c01aa42 127.0.0.1
TestAgent_unloadServices - 2019/12/30 18:57:26.782120 [INFO] consul: Adding LAN server Node 11f054ad-81ed-4603-37ee-987d1c01aa42 (Addr: tcp/127.0.0.1:18922) (DC: dc1)
TestAgent_unloadServices - 2019/12/30 18:57:26.790361 [INFO] consul: Handled member-join event for server "Node 11f054ad-81ed-4603-37ee-987d1c01aa42.dc1" in area "wan"
TestAgent_unloadServices - 2019/12/30 18:57:26.793620 [INFO] agent: Started DNS server 127.0.0.1:18917 (tcp)
TestAgent_unloadServices - 2019/12/30 18:57:26.794082 [INFO] agent: Started DNS server 127.0.0.1:18917 (udp)
TestAgent_unloadServices - 2019/12/30 18:57:26.797349 [INFO] agent: Started HTTP server on 127.0.0.1:18918 (tcp)
TestAgent_unloadServices - 2019/12/30 18:57:26.797479 [INFO] agent: started state syncer
TestAgent_loadProxies - 2019/12/30 18:57:26.925463 [ERR] leaf watch error: invalid type for leaf response: <nil>
2019/12/30 18:57:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c06e1a84-48b5-2378-eddd-6bfca7ee9010 Address:127.0.0.1:18928}]
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:26.999229 [INFO] serf: EventMemberJoin: Node c06e1a84-48b5-2378-eddd-6bfca7ee9010.dc1 127.0.0.1
2019/12/30 18:57:26 [INFO]  raft: Node at 127.0.0.1:18928 [Follower] entering Follower state (Leader: "")
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.003528 [INFO] serf: EventMemberJoin: Node c06e1a84-48b5-2378-eddd-6bfca7ee9010 127.0.0.1
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.004713 [INFO] consul: Adding LAN server Node c06e1a84-48b5-2378-eddd-6bfca7ee9010 (Addr: tcp/127.0.0.1:18928) (DC: dc1)
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.004973 [INFO] consul: Handled member-join event for server "Node c06e1a84-48b5-2378-eddd-6bfca7ee9010.dc1" in area "wan"
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.005481 [INFO] agent: Started DNS server 127.0.0.1:18923 (udp)
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.005904 [INFO] agent: Started DNS server 127.0.0.1:18923 (tcp)
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.008269 [INFO] agent: Started HTTP server on 127.0.0.1:18924 (tcp)
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.008415 [INFO] agent: started state syncer
2019/12/30 18:57:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:27 [INFO]  raft: Node at 127.0.0.1:18928 [Candidate] entering Candidate state in term 2
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.069577 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.069806 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.069878 [INFO] agent: consul server down
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.069927 [INFO] agent: shutdown complete
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.069981 [INFO] agent: Stopping DNS server 127.0.0.1:18905 (tcp)
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.070135 [INFO] agent: Stopping DNS server 127.0.0.1:18905 (udp)
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.070307 [INFO] agent: Stopping HTTP server 127.0.0.1:18906 (tcp)
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.070579 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadProxies_nilProxy - 2019/12/30 18:57:27.070651 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadProxies_nilProxy (3.26s)
=== CONT  TestAgent_loadServices_sidecarInheritMeta
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:27.136982 [WARN] agent: Node name "Node 6caecb47-feab-2953-2b78-bcc7d53781e4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:27.137500 [DEBUG] tlsutil: Update with version 1
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:27.140014 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_loadProxies - 2019/12/30 18:57:27.246885 [INFO] agent: Synced service "rabbitmq-proxy"
2019/12/30 18:57:27 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:27 [INFO]  raft: Node at 127.0.0.1:18922 [Leader] entering Leader state
TestAgent_unloadServices - 2019/12/30 18:57:27.461590 [INFO] consul: cluster leadership acquired
TestAgent_unloadServices - 2019/12/30 18:57:27.462073 [INFO] consul: New leader elected: Node 11f054ad-81ed-4603-37ee-987d1c01aa42
jones - 2019/12/30 18:57:27.494241 [DEBUG] manager: Rebalanced 1 servers, next active server is Node a8b3e297-b53a-bcd0-efda-5addcd938805.dc1 (Addr: tcp/127.0.0.1:17728) (DC: dc1)
TestAgent_unloadServices - 2019/12/30 18:57:27.553689 [DEBUG] agent: removed service "redis"
TestAgent_unloadServices - 2019/12/30 18:57:27.553791 [INFO] agent: Requesting shutdown
TestAgent_unloadServices - 2019/12/30 18:57:27.553858 [INFO] consul: shutting down server
TestAgent_unloadServices - 2019/12/30 18:57:27.553914 [WARN] serf: Shutdown without a Leave
TestAgent_unloadServices - 2019/12/30 18:57:27.554049 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_unloadServices - 2019/12/30 18:57:27.727640 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:27 [INFO]  raft: Election won. Tally: 1
TestAgent_unloadServices - 2019/12/30 18:57:27.803088 [INFO] manager: shutting down
2019/12/30 18:57:27 [INFO]  raft: Node at 127.0.0.1:18928 [Leader] entering Leader state
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.804488 [INFO] consul: cluster leadership acquired
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.805120 [INFO] consul: New leader elected: Node c06e1a84-48b5-2378-eddd-6bfca7ee9010
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:27.810808 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestAgent_unloadServices - 2019/12/30 18:57:27.929561 [INFO] agent: consul server down
TestAgent_unloadServices - 2019/12/30 18:57:27.929636 [INFO] agent: shutdown complete
TestAgent_unloadServices - 2019/12/30 18:57:27.929843 [INFO] agent: Stopping DNS server 127.0.0.1:18917 (tcp)
TestAgent_unloadServices - 2019/12/30 18:57:27.930096 [INFO] agent: Stopping DNS server 127.0.0.1:18917 (udp)
TestAgent_unloadServices - 2019/12/30 18:57:27.930284 [INFO] agent: Stopping HTTP server 127.0.0.1:18918 (tcp)
TestAgent_unloadServices - 2019/12/30 18:57:27.930513 [INFO] agent: Waiting for endpoints to shut down
TestAgent_unloadServices - 2019/12/30 18:57:27.930584 [INFO] agent: Endpoints down
--- PASS: TestAgent_unloadServices (2.58s)
=== CONT  TestAgent_loadServices_sidecarSeparateToken
TestAgent_unloadServices - 2019/12/30 18:57:27.931352 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:28.015915 [WARN] agent: Node name "Node 234a68d4-df43-6eef-2f69-406757d4fd2d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:28.016519 [DEBUG] tlsutil: Update with version 1
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:28.019702 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_loadProxies - 2019/12/30 18:57:28.163423 [INFO] agent: Synced service "rabbitmq"
TestAgent_loadProxies - 2019/12/30 18:57:28.163544 [DEBUG] agent: Check "service:rabbitmq-proxy" in sync
TestAgent_loadProxies - 2019/12/30 18:57:28.163584 [DEBUG] agent: Node info in sync
TestAgent_loadProxies - 2019/12/30 18:57:28.163713 [DEBUG] agent: Service "rabbitmq-proxy" in sync
TestAgent_loadProxies - 2019/12/30 18:57:28.163761 [DEBUG] agent: Service "rabbitmq" in sync
TestAgent_loadProxies - 2019/12/30 18:57:28.163806 [DEBUG] agent: Check "service:rabbitmq-proxy" in sync
TestAgent_loadProxies - 2019/12/30 18:57:28.163838 [DEBUG] agent: Node info in sync
TestAgent_loadProxies - 2019/12/30 18:57:28.163953 [INFO] agent: Requesting shutdown
TestAgent_loadProxies - 2019/12/30 18:57:28.164040 [INFO] consul: shutting down server
TestAgent_loadProxies - 2019/12/30 18:57:28.164095 [WARN] serf: Shutdown without a Leave
TestAgent_loadProxies - 2019/12/30 18:57:28.244379 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.250391 [INFO] agent: Synced service "rabbitmq"
2019/12/30 18:57:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6caecb47-feab-2953-2b78-bcc7d53781e4 Address:127.0.0.1:18934}]
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.331730 [INFO] serf: EventMemberJoin: Node 6caecb47-feab-2953-2b78-bcc7d53781e4.dc1 127.0.0.1
TestAgent_loadProxies - 2019/12/30 18:57:28.332978 [INFO] manager: shutting down
TestAgent_loadProxies - 2019/12/30 18:57:28.333402 [INFO] agent: consul server down
TestAgent_loadProxies - 2019/12/30 18:57:28.333448 [INFO] agent: shutdown complete
TestAgent_loadProxies - 2019/12/30 18:57:28.333498 [INFO] agent: Stopping DNS server 127.0.0.1:18911 (tcp)
TestAgent_loadProxies - 2019/12/30 18:57:28.333618 [INFO] agent: Stopping DNS server 127.0.0.1:18911 (udp)
TestAgent_loadProxies - 2019/12/30 18:57:28.333758 [INFO] agent: Stopping HTTP server 127.0.0.1:18912 (tcp)
TestAgent_loadProxies - 2019/12/30 18:57:28.333931 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadProxies - 2019/12/30 18:57:28.333993 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadProxies (3.78s)
=== CONT  TestAgent_loadServices_sidecar
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.335112 [INFO] serf: EventMemberJoin: Node 6caecb47-feab-2953-2b78-bcc7d53781e4 127.0.0.1
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.336588 [INFO] agent: Started DNS server 127.0.0.1:18929 (udp)
2019/12/30 18:57:28 [INFO]  raft: Node at 127.0.0.1:18934 [Follower] entering Follower state (Leader: "")
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.338942 [INFO] consul: Adding LAN server Node 6caecb47-feab-2953-2b78-bcc7d53781e4 (Addr: tcp/127.0.0.1:18934) (DC: dc1)
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.339204 [INFO] consul: Handled member-join event for server "Node 6caecb47-feab-2953-2b78-bcc7d53781e4.dc1" in area "wan"
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.340334 [INFO] agent: Started DNS server 127.0.0.1:18929 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.342808 [INFO] agent: Started HTTP server on 127.0.0.1:18930 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:28.342936 [INFO] agent: started state syncer
TestAgent_loadProxies - 2019/12/30 18:57:28.354243 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestAgent_loadProxies - 2019/12/30 18:57:28.364634 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
2019/12/30 18:57:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:28 [INFO]  raft: Node at 127.0.0.1:18934 [Candidate] entering Candidate state in term 2
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.430050 [INFO] agent: Synced service "rabbitmq-sidecar-proxy"
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.430159 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:1" in sync
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.430203 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:2" in sync
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.430234 [DEBUG] agent: Node info in sync
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.430695 [INFO] agent: Requesting shutdown
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.430824 [INFO] consul: shutting down server
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.430880 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadServices_sidecar - 2019/12/30 18:57:28.470753 [WARN] agent: Node name "Node bd0a60a7-f2ec-0b74-19ca-7899aa9e47e6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadServices_sidecar - 2019/12/30 18:57:28.471750 [DEBUG] tlsutil: Update with version 1
TestAgent_loadServices_sidecar - 2019/12/30 18:57:28.475509 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.538882 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.711248 [INFO] manager: shutting down
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.796813 [INFO] agent: consul server down
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.799357 [INFO] agent: shutdown complete
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.798055 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.799972 [INFO] agent: Stopping DNS server 127.0.0.1:18923 (tcp)
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.800555 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.801601 [INFO] agent: Stopping DNS server 127.0.0.1:18923 (udp)
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.801808 [INFO] agent: Stopping HTTP server 127.0.0.1:18924 (tcp)
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.802019 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadServices_sidecarOverrideMeta - 2019/12/30 18:57:28.802584 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadServices_sidecarOverrideMeta (2.91s)
=== CONT  TestAgent_loadServices_token
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadServices_token - 2019/12/30 18:57:28.890209 [WARN] agent: Node name "Node 52e44c53-a99e-2135-2363-a81eaae603b8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadServices_token - 2019/12/30 18:57:28.891239 [DEBUG] tlsutil: Update with version 1
TestAgent_loadServices_token - 2019/12/30 18:57:28.901894 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:29 [INFO]  raft: Node at 127.0.0.1:18934 [Leader] entering Leader state
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.065460 [INFO] consul: cluster leadership acquired
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.065882 [INFO] consul: New leader elected: Node 6caecb47-feab-2953-2b78-bcc7d53781e4
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.125732 [INFO] agent: Requesting shutdown
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.125913 [INFO] consul: shutting down server
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.125986 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.126268 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:57:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:234a68d4-df43-6eef-2f69-406757d4fd2d Address:127.0.0.1:18940}]
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.165875 [INFO] serf: EventMemberJoin: Node 234a68d4-df43-6eef-2f69-406757d4fd2d.dc1 127.0.0.1
2019/12/30 18:57:29 [INFO]  raft: Node at 127.0.0.1:18940 [Follower] entering Follower state (Leader: "")
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.169566 [INFO] serf: EventMemberJoin: Node 234a68d4-df43-6eef-2f69-406757d4fd2d 127.0.0.1
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.170594 [INFO] consul: Adding LAN server Node 234a68d4-df43-6eef-2f69-406757d4fd2d (Addr: tcp/127.0.0.1:18940) (DC: dc1)
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.170908 [INFO] consul: Handled member-join event for server "Node 234a68d4-df43-6eef-2f69-406757d4fd2d.dc1" in area "wan"
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.173520 [INFO] agent: Started DNS server 127.0.0.1:18935 (tcp)
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.179942 [INFO] agent: Started DNS server 127.0.0.1:18935 (udp)
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.182519 [INFO] agent: Started HTTP server on 127.0.0.1:18936 (tcp)
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.182760 [INFO] agent: started state syncer
2019/12/30 18:57:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:29 [INFO]  raft: Node at 127.0.0.1:18940 [Candidate] entering Candidate state in term 2
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.249576 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.329696 [INFO] manager: shutting down
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.329842 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.337322 [INFO] agent: consul server down
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.337387 [INFO] agent: shutdown complete
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.337476 [INFO] agent: Stopping DNS server 127.0.0.1:18929 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.337610 [INFO] agent: Stopping DNS server 127.0.0.1:18929 (udp)
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.337769 [INFO] agent: Stopping HTTP server 127.0.0.1:18930 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.337960 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadServices_sidecarInheritMeta - 2019/12/30 18:57:29.338028 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadServices_sidecarInheritMeta (2.27s)
=== CONT  TestAgent_unloadChecks
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_unloadChecks - 2019/12/30 18:57:29.476938 [WARN] agent: Node name "Node 3ed28d68-3093-8e86-c21e-7981b2c53794" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_unloadChecks - 2019/12/30 18:57:29.477395 [DEBUG] tlsutil: Update with version 1
TestAgent_unloadChecks - 2019/12/30 18:57:29.479539 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bd0a60a7-f2ec-0b74-19ca-7899aa9e47e6 Address:127.0.0.1:18946}]
2019/12/30 18:57:29 [INFO]  raft: Node at 127.0.0.1:18946 [Follower] entering Follower state (Leader: "")
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.702751 [INFO] serf: EventMemberJoin: Node bd0a60a7-f2ec-0b74-19ca-7899aa9e47e6.dc1 127.0.0.1
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.707077 [INFO] serf: EventMemberJoin: Node bd0a60a7-f2ec-0b74-19ca-7899aa9e47e6 127.0.0.1
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.708109 [INFO] consul: Adding LAN server Node bd0a60a7-f2ec-0b74-19ca-7899aa9e47e6 (Addr: tcp/127.0.0.1:18946) (DC: dc1)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.708273 [INFO] consul: Handled member-join event for server "Node bd0a60a7-f2ec-0b74-19ca-7899aa9e47e6.dc1" in area "wan"
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.713150 [INFO] agent: Started DNS server 127.0.0.1:18941 (tcp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.714454 [INFO] agent: Started DNS server 127.0.0.1:18941 (udp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.716937 [INFO] agent: Started HTTP server on 127.0.0.1:18942 (tcp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:29.717040 [INFO] agent: started state syncer
2019/12/30 18:57:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:29 [INFO]  raft: Node at 127.0.0.1:18946 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:29 [INFO]  raft: Node at 127.0.0.1:18940 [Leader] entering Leader state
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.938477 [INFO] consul: cluster leadership acquired
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:29.938894 [INFO] consul: New leader elected: Node 234a68d4-df43-6eef-2f69-406757d4fd2d
2019/12/30 18:57:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:52e44c53-a99e-2135-2363-a81eaae603b8 Address:127.0.0.1:18952}]
2019/12/30 18:57:30 [INFO]  raft: Node at 127.0.0.1:18952 [Follower] entering Follower state (Leader: "")
TestAgent_loadServices_token - 2019/12/30 18:57:30.016916 [INFO] serf: EventMemberJoin: Node 52e44c53-a99e-2135-2363-a81eaae603b8.dc1 127.0.0.1
TestAgent_loadServices_token - 2019/12/30 18:57:30.023850 [INFO] serf: EventMemberJoin: Node 52e44c53-a99e-2135-2363-a81eaae603b8 127.0.0.1
TestAgent_loadServices_token - 2019/12/30 18:57:30.025936 [INFO] consul: Adding LAN server Node 52e44c53-a99e-2135-2363-a81eaae603b8 (Addr: tcp/127.0.0.1:18952) (DC: dc1)
TestAgent_loadServices_token - 2019/12/30 18:57:30.027453 [INFO] consul: Handled member-join event for server "Node 52e44c53-a99e-2135-2363-a81eaae603b8.dc1" in area "wan"
TestAgent_loadServices_token - 2019/12/30 18:57:30.028791 [INFO] agent: Started DNS server 127.0.0.1:18947 (tcp)
TestAgent_loadServices_token - 2019/12/30 18:57:30.029483 [INFO] agent: Started DNS server 127.0.0.1:18947 (udp)
TestAgent_loadServices_token - 2019/12/30 18:57:30.031863 [INFO] agent: Started HTTP server on 127.0.0.1:18948 (tcp)
TestAgent_loadServices_token - 2019/12/30 18:57:30.031983 [INFO] agent: started state syncer
2019/12/30 18:57:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:30 [INFO]  raft: Node at 127.0.0.1:18952 [Candidate] entering Candidate state in term 2
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.160969 [ERR] leaf watch error: invalid type for leaf response: <nil>
2019/12/30 18:57:30 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:30 [INFO]  raft: Node at 127.0.0.1:18946 [Leader] entering Leader state
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.279202 [INFO] agent: Synced service "rabbitmq"
TestAgent_loadServices_sidecar - 2019/12/30 18:57:30.282265 [INFO] consul: cluster leadership acquired
TestAgent_loadServices_sidecar - 2019/12/30 18:57:30.282689 [INFO] consul: New leader elected: Node bd0a60a7-f2ec-0b74-19ca-7899aa9e47e6
TestAgent_loadServices_sidecar - 2019/12/30 18:57:30.331856 [ERR] leaf watch error: invalid type for leaf response: <nil>
2019/12/30 18:57:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3ed28d68-3093-8e86-c21e-7981b2c53794 Address:127.0.0.1:18958}]
2019/12/30 18:57:30 [INFO]  raft: Node at 127.0.0.1:18958 [Follower] entering Follower state (Leader: "")
TestAgent_unloadChecks - 2019/12/30 18:57:30.444067 [INFO] serf: EventMemberJoin: Node 3ed28d68-3093-8e86-c21e-7981b2c53794.dc1 127.0.0.1
TestAgent_unloadChecks - 2019/12/30 18:57:30.447511 [INFO] serf: EventMemberJoin: Node 3ed28d68-3093-8e86-c21e-7981b2c53794 127.0.0.1
TestAgent_unloadChecks - 2019/12/30 18:57:30.448250 [INFO] consul: Adding LAN server Node 3ed28d68-3093-8e86-c21e-7981b2c53794 (Addr: tcp/127.0.0.1:18958) (DC: dc1)
TestAgent_unloadChecks - 2019/12/30 18:57:30.448274 [INFO] consul: Handled member-join event for server "Node 3ed28d68-3093-8e86-c21e-7981b2c53794.dc1" in area "wan"
TestAgent_unloadChecks - 2019/12/30 18:57:30.448775 [INFO] agent: Started DNS server 127.0.0.1:18953 (tcp)
TestAgent_unloadChecks - 2019/12/30 18:57:30.448864 [INFO] agent: Started DNS server 127.0.0.1:18953 (udp)
TestAgent_unloadChecks - 2019/12/30 18:57:30.451132 [INFO] agent: Started HTTP server on 127.0.0.1:18954 (tcp)
TestAgent_unloadChecks - 2019/12/30 18:57:30.451248 [INFO] agent: started state syncer
2019/12/30 18:57:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:30 [INFO]  raft: Node at 127.0.0.1:18958 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:30 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:30 [INFO]  raft: Node at 127.0.0.1:18952 [Leader] entering Leader state
TestAgent_loadServices_token - 2019/12/30 18:57:30.720755 [INFO] consul: cluster leadership acquired
TestAgent_loadServices_token - 2019/12/30 18:57:30.721288 [INFO] consul: New leader elected: Node 52e44c53-a99e-2135-2363-a81eaae603b8
TestAgent_loadServices_sidecar - 2019/12/30 18:57:30.723807 [INFO] agent: Synced service "rabbitmq"
TestAgent_loadServices_token - 2019/12/30 18:57:30.784495 [INFO] agent: Requesting shutdown
TestAgent_loadServices_token - 2019/12/30 18:57:30.784610 [INFO] consul: shutting down server
TestAgent_loadServices_token - 2019/12/30 18:57:30.784661 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_token - 2019/12/30 18:57:30.784983 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907123 [INFO] agent: Synced service "rabbitmq-sidecar-proxy"
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907220 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:1" in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907273 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:2" in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907319 [DEBUG] agent: Node info in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907438 [DEBUG] agent: Service "rabbitmq" in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907489 [DEBUG] agent: Service "rabbitmq-sidecar-proxy" in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907534 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:1" in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907574 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:2" in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907612 [DEBUG] agent: Node info in sync
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907709 [INFO] agent: Requesting shutdown
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907824 [INFO] consul: shutting down server
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:30.907870 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_token - 2019/12/30 18:57:30.909996 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.002747 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_token - 2019/12/30 18:57:31.003683 [INFO] manager: shutting down
2019/12/30 18:57:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:31 [INFO]  raft: Node at 127.0.0.1:18958 [Leader] entering Leader state
TestAgent_loadServices_token - 2019/12/30 18:57:31.078132 [INFO] agent: consul server down
TestAgent_loadServices_token - 2019/12/30 18:57:31.078193 [INFO] agent: shutdown complete
TestAgent_loadServices_token - 2019/12/30 18:57:31.078306 [INFO] agent: Stopping DNS server 127.0.0.1:18947 (tcp)
TestAgent_loadServices_token - 2019/12/30 18:57:31.078511 [INFO] agent: Stopping DNS server 127.0.0.1:18947 (udp)
TestAgent_loadServices_token - 2019/12/30 18:57:31.078707 [INFO] agent: Stopping HTTP server 127.0.0.1:18948 (tcp)
TestAgent_loadServices_token - 2019/12/30 18:57:31.078975 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadServices_token - 2019/12/30 18:57:31.079076 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadServices_token (2.28s)
=== CONT  TestAgent_loadChecks_token
TestAgent_unloadChecks - 2019/12/30 18:57:31.079844 [INFO] consul: cluster leadership acquired
TestAgent_unloadChecks - 2019/12/30 18:57:31.080329 [INFO] consul: New leader elected: Node 3ed28d68-3093-8e86-c21e-7981b2c53794
TestAgent_loadServices_token - 2019/12/30 18:57:31.084299 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.085635 [INFO] manager: shutting down
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.086066 [INFO] agent: consul server down
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.086119 [INFO] agent: shutdown complete
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.086169 [INFO] agent: Stopping DNS server 127.0.0.1:18935 (tcp)
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.086309 [INFO] agent: Stopping DNS server 127.0.0.1:18935 (udp)
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.086480 [INFO] agent: Stopping HTTP server 127.0.0.1:18936 (tcp)
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.086691 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.086763 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadServices_sidecarSeparateToken (3.16s)
=== CONT  TestAgent_PurgeCheckOnDuplicate
TestAgent_unloadChecks - 2019/12/30 18:57:31.116884 [DEBUG] agent: removed check "service:redis"
TestAgent_unloadChecks - 2019/12/30 18:57:31.116988 [INFO] agent: Requesting shutdown
TestAgent_unloadChecks - 2019/12/30 18:57:31.117044 [INFO] consul: shutting down server
TestAgent_unloadChecks - 2019/12/30 18:57:31.117090 [WARN] serf: Shutdown without a Leave
TestAgent_unloadChecks - 2019/12/30 18:57:31.117489 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.147855 [INFO] agent: Synced service "rabbitmq-sidecar-proxy"
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.148185 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:1" in sync
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.148339 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:2" in sync
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.148461 [DEBUG] agent: Node info in sync
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.148726 [INFO] agent: Requesting shutdown
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.148966 [INFO] consul: shutting down server
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.149114 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_sidecarSeparateToken - 2019/12/30 18:57:31.170423 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestAgent_unloadChecks - 2019/12/30 18:57:31.227710 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.255384 [WARN] agent: Node name "Node 1707e1e0-f1fa-cbf1-ba9d-e69c5d3ab0ae" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.255819 [DEBUG] tlsutil: Update with version 1
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.258861 [INFO] serf: EventMemberJoin: Node 1707e1e0-f1fa-cbf1-ba9d-e69c5d3ab0ae 127.0.0.1
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.270024 [INFO] agent: Started DNS server 127.0.0.1:18965 (tcp)
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.270135 [INFO] agent: Started DNS server 127.0.0.1:18965 (udp)
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.272487 [INFO] agent: Started HTTP server on 127.0.0.1:18966 (tcp)
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.272598 [INFO] agent: started state syncer
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.272828 [WARN] manager: No servers available
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.272894 [ERR] agent: failed to sync remote state: No known Consul servers
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadChecks_token - 2019/12/30 18:57:31.291785 [WARN] agent: Node name "Node 7c4f133d-d0a2-f0fd-2219-ceafde657a5c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadChecks_token - 2019/12/30 18:57:31.294773 [DEBUG] tlsutil: Update with version 1
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.302664 [WARN] serf: Shutdown without a Leave
TestAgent_unloadChecks - 2019/12/30 18:57:31.302747 [INFO] manager: shutting down
TestAgent_unloadChecks - 2019/12/30 18:57:31.302898 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestAgent_unloadChecks - 2019/12/30 18:57:31.305463 [INFO] agent: consul server down
TestAgent_unloadChecks - 2019/12/30 18:57:31.305532 [INFO] agent: shutdown complete
TestAgent_unloadChecks - 2019/12/30 18:57:31.305587 [INFO] agent: Stopping DNS server 127.0.0.1:18953 (tcp)
TestAgent_unloadChecks - 2019/12/30 18:57:31.305737 [INFO] agent: Stopping DNS server 127.0.0.1:18953 (udp)
TestAgent_unloadChecks - 2019/12/30 18:57:31.305897 [INFO] agent: Stopping HTTP server 127.0.0.1:18954 (tcp)
TestAgent_unloadChecks - 2019/12/30 18:57:31.306096 [INFO] agent: Waiting for endpoints to shut down
TestAgent_unloadChecks - 2019/12/30 18:57:31.306169 [INFO] agent: Endpoints down
--- PASS: TestAgent_unloadChecks (1.97s)
=== CONT  TestAgent_PersistCheck
TestAgent_loadChecks_token - 2019/12/30 18:57:31.308677 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:57:31.378374 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:31.378470 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/30 18:57:31.378506 [DEBUG] agent: Node info in sync
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.395504 [INFO] agent: Requesting shutdown
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.395597 [INFO] consul: shutting down client
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.395656 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.395971 [INFO] manager: shutting down
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.477717 [INFO] agent: consul client down
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.477776 [INFO] manager: shutting down
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.477799 [INFO] agent: shutdown complete
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.477877 [INFO] agent: Stopping DNS server 127.0.0.1:18965 (tcp)
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.478067 [INFO] agent: Stopping DNS server 127.0.0.1:18965 (udp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478230 [INFO] agent: consul server down
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.478241 [INFO] agent: Stopping HTTP server 127.0.0.1:18966 (tcp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478279 [INFO] agent: shutdown complete
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478335 [INFO] agent: Stopping DNS server 127.0.0.1:18941 (tcp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478466 [INFO] agent: Stopping DNS server 127.0.0.1:18941 (udp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478651 [INFO] agent: Stopping HTTP server 127.0.0.1:18942 (tcp)
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478736 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478871 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.478474 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadServices_sidecar - 2019/12/30 18:57:31.478934 [INFO] agent: Endpoints down
TestAgent_PurgeCheckOnDuplicate - 2019/12/30 18:57:31.478956 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadServices_sidecar (3.14s)
=== CONT  TestAgent_PurgeProxyOnDuplicate
TestAgent_PersistCheck - 2019/12/30 18:57:31.489959 [WARN] agent: Node name "Node 43969c79-44cf-cc06-652c-eedfab50de00" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PersistCheck - 2019/12/30 18:57:31.492494 [DEBUG] tlsutil: Update with version 1
TestAgent_PersistCheck - 2019/12/30 18:57:31.508304 [INFO] serf: EventMemberJoin: Node 43969c79-44cf-cc06-652c-eedfab50de00 127.0.0.1
TestAgent_PersistCheck - 2019/12/30 18:57:31.509558 [INFO] agent: Started DNS server 127.0.0.1:18971 (udp)
TestAgent_PersistCheck - 2019/12/30 18:57:31.510136 [INFO] agent: Started DNS server 127.0.0.1:18971 (tcp)
TestAgent_PersistCheck - 2019/12/30 18:57:31.519149 [INFO] agent: Started HTTP server on 127.0.0.1:18972 (tcp)
TestAgent_PersistCheck - 2019/12/30 18:57:31.519250 [INFO] agent: started state syncer
TestAgent_PersistCheck - 2019/12/30 18:57:31.524964 [WARN] manager: No servers available
TestAgent_PersistCheck - 2019/12/30 18:57:31.525039 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.613406 [WARN] agent: Node name "Node 1707e1e0-f1fa-cbf1-ba9d-e69c5d3ab0ae" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.613873 [DEBUG] tlsutil: Update with version 1
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.626209 [WARN] agent: Node name "Node 2634e341-340d-7a1e-f522-d51510f8ef9a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.626800 [DEBUG] tlsutil: Update with version 1
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.630113 [INFO] serf: EventMemberJoin: Node 2634e341-340d-7a1e-f522-d51510f8ef9a 127.0.0.1
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.631332 [INFO] agent: Started DNS server 127.0.0.1:18977 (udp)
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.631523 [INFO] agent: Started DNS server 127.0.0.1:18977 (tcp)
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.632182 [INFO] serf: EventMemberJoin: Node 1707e1e0-f1fa-cbf1-ba9d-e69c5d3ab0ae 127.0.0.1
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.633207 [INFO] agent: Started DNS server 127.0.0.1:18983 (udp)
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.633279 [INFO] agent: Started DNS server 127.0.0.1:18983 (tcp)
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.634906 [INFO] agent: Started HTTP server on 127.0.0.1:18978 (tcp)
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.635748 [INFO] agent: Started HTTP server on 127.0.0.1:18984 (tcp)
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.638606 [INFO] agent: started state syncer
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.638872 [INFO] agent: started state syncer
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.641398 [INFO] agent: Requesting shutdown
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.641490 [INFO] consul: shutting down client
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.641530 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.638948 [WARN] manager: No servers available
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.641880 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.642002 [INFO] manager: shutting down
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.643961 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.644261 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.744495 [INFO] agent: consul client down
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.744577 [INFO] agent: shutdown complete
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.744638 [INFO] agent: Stopping DNS server 127.0.0.1:18983 (tcp)
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.744860 [INFO] agent: Stopping DNS server 127.0.0.1:18983 (udp)
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.745068 [INFO] agent: Stopping HTTP server 127.0.0.1:18984 (tcp)
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.745305 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeCheckOnDuplicate-a2 - 2019/12/30 18:57:31.745378 [INFO] agent: Endpoints down
--- PASS: TestAgent_PurgeCheckOnDuplicate (0.66s)
=== CONT  TestAgent_PurgeProxy
TestAgent_PersistCheck - 2019/12/30 18:57:31.749078 [INFO] agent: Requesting shutdown
TestAgent_PersistCheck - 2019/12/30 18:57:31.749180 [INFO] consul: shutting down client
TestAgent_PersistCheck - 2019/12/30 18:57:31.749221 [WARN] serf: Shutdown without a Leave
TestAgent_PersistCheck - 2019/12/30 18:57:31.749585 [INFO] manager: shutting down
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.754607 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.754791 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.754834 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755014 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755130 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755281 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755424 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755532 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755580 [ERR] roots watch error: invalid type for roots response: <nil>
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755679 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755853 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.755963 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756079 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756104 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756190 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756281 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756558 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756604 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756714 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.756874 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.757047 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.757047 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.757223 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.757502 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.757757 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.757969 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.758159 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.758349 [WARN] manager: No servers available
TestAgent_PersistCheck - 2019/12/30 18:57:31.836114 [INFO] agent: consul client down
TestAgent_PersistCheck - 2019/12/30 18:57:31.836196 [INFO] agent: shutdown complete
TestAgent_PersistCheck - 2019/12/30 18:57:31.836257 [INFO] agent: Stopping DNS server 127.0.0.1:18971 (tcp)
TestAgent_PersistCheck - 2019/12/30 18:57:31.836426 [INFO] agent: Stopping DNS server 127.0.0.1:18971 (udp)
TestAgent_PersistCheck - 2019/12/30 18:57:31.836579 [INFO] agent: Stopping HTTP server 127.0.0.1:18972 (tcp)
TestAgent_PersistCheck - 2019/12/30 18:57:31.836775 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PersistCheck - 2019/12/30 18:57:31.836848 [INFO] agent: Endpoints down
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_PurgeProxy - 2019/12/30 18:57:31.892935 [WARN] agent: Node name "Node 164f5643-279d-c6b7-679e-e79dc4c4c6e9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeProxy - 2019/12/30 18:57:31.893486 [DEBUG] tlsutil: Update with version 1
TestAgent_PurgeProxy - 2019/12/30 18:57:31.897683 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:57:31.958949 [DEBUG] manager: Rebalanced 1 servers, next active server is Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786.dc1 (Addr: tcp/127.0.0.1:17692) (DC: dc1)
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.979025 [WARN] agent: Node name "Node 1e10de38-cd39-adde-3774-17c1cad1105f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.979527 [DEBUG] tlsutil: Update with version 1
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.982452 [INFO] serf: EventMemberJoin: Node 1e10de38-cd39-adde-3774-17c1cad1105f 127.0.0.1
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.983216 [INFO] serf: Attempting re-join to previously known node: Node 43969c79-44cf-cc06-652c-eedfab50de00: 127.0.0.1:18974
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.984056 [DEBUG] memberlist: Failed to join 127.0.0.1: dial tcp 127.0.0.1:18974: connect: connection refused
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.984112 [WARN] serf: Failed to re-join any previously known node
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.989287 [DEBUG] agent: restored health check "mem" from "/tmp/consul-test/TestAgent_PersistCheck-agent157732258/checks/afc4fc7e48a0710a1dc94ef3e8bc5764"
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.990197 [INFO] agent: Started DNS server 127.0.0.1:18995 (tcp)
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.990517 [INFO] agent: Started DNS server 127.0.0.1:18995 (udp)
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.993252 [INFO] agent: Started HTTP server on 127.0.0.1:18996 (tcp)
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.993375 [INFO] agent: started state syncer
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.993606 [WARN] manager: No servers available
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:31.993671 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.996365 [INFO] agent: Requesting shutdown
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.996492 [INFO] consul: shutting down client
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.996533 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:31.996886 [INFO] manager: shutting down
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.001060 [INFO] agent: Requesting shutdown
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.001160 [INFO] consul: shutting down client
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.001209 [WARN] serf: Shutdown without a Leave
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.001568 [INFO] manager: shutting down
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:32.077839 [INFO] agent: consul client down
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:32.077929 [INFO] agent: shutdown complete
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:32.078001 [INFO] agent: Stopping DNS server 127.0.0.1:18977 (tcp)
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:32.078188 [INFO] agent: Stopping DNS server 127.0.0.1:18977 (udp)
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:32.078352 [INFO] agent: Stopping HTTP server 127.0.0.1:18978 (tcp)
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:32.078559 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:32.078624 [INFO] agent: Endpoints down
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.135491 [WARN] agent: Node name "Node 862c3655-746a-a52d-b5af-e261725a3baf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.136919 [DEBUG] tlsutil: Update with version 1
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.138199 [INFO] agent: consul client down
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.138275 [INFO] agent: shutdown complete
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.138339 [INFO] agent: Stopping DNS server 127.0.0.1:18995 (tcp)
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.138503 [INFO] agent: Stopping DNS server 127.0.0.1:18995 (udp)
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.138720 [INFO] agent: Stopping HTTP server 127.0.0.1:18996 (tcp)
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.138914 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PersistCheck-a2 - 2019/12/30 18:57:32.138980 [INFO] agent: Endpoints down
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.140776 [INFO] serf: EventMemberJoin: Node 862c3655-746a-a52d-b5af-e261725a3baf 127.0.0.1
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.141334 [INFO] serf: Attempting re-join to previously known node: Node 2634e341-340d-7a1e-f522-d51510f8ef9a: 127.0.0.1:18980
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.142040 [DEBUG] agent: restored service definition "redis-proxy" from "/tmp/consul-test/TestAgent_PurgeProxyOnDuplicate-agent152631588/services/6ef0f40bebf8c49e46e72d22ec645f5d"
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.142519 [DEBUG] agent: service "redis" exists, not restoring from "/tmp/consul-test/TestAgent_PurgeProxyOnDuplicate-agent152631588/services/86a1b907d54bf7010394bf316e183e67"
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.144952 [DEBUG] memberlist: Failed to join 127.0.0.1: dial tcp 127.0.0.1:18980: connect: connection refused
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.145022 [WARN] serf: Failed to re-join any previously known node
--- PASS: TestAgent_PersistCheck (0.84s)
=== CONT  TestAgent_PersistProxy
TestAgent_PersistProxy - 2019/12/30 18:57:32.234218 [WARN] agent: Node name "Node 56945cee-6dd0-189e-a4ba-8a248c4cbae3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PersistProxy - 2019/12/30 18:57:32.234674 [DEBUG] tlsutil: Update with version 1
TestAgent_PersistProxy - 2019/12/30 18:57:32.238122 [INFO] serf: EventMemberJoin: Node 56945cee-6dd0-189e-a4ba-8a248c4cbae3 127.0.0.1
TestAgent_PersistProxy - 2019/12/30 18:57:32.239101 [INFO] agent: Started DNS server 127.0.0.1:17508 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:32.239188 [INFO] agent: Started DNS server 127.0.0.1:17508 (udp)
TestAgent_PersistProxy - 2019/12/30 18:57:32.241758 [INFO] agent: Started HTTP server on 127.0.0.1:17509 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:32.242068 [INFO] agent: started state syncer
TestAgent_PersistProxy - 2019/12/30 18:57:32.242243 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.242311 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PersistProxy - 2019/12/30 18:57:32.343586 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.343813 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.343952 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.344086 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.344617 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.344785 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.344920 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.344963 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.345056 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.345259 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.345388 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.345451 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.345580 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.345758 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.345907 [ERR] roots watch error: invalid type for roots response: <nil>
TestAgent_PersistProxy - 2019/12/30 18:57:32.345603 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346097 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346213 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346067 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestAgent_PersistProxy - 2019/12/30 18:57:32.346395 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346430 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346536 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346687 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346816 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.346918 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.347097 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.347241 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.347411 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.347562 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:32.347728 [WARN] manager: No servers available
2019/12/30 18:57:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7c4f133d-d0a2-f0fd-2219-ceafde657a5c Address:127.0.0.1:18964}]
2019/12/30 18:57:32 [INFO]  raft: Node at 127.0.0.1:18964 [Follower] entering Follower state (Leader: "")
TestAgent_loadChecks_token - 2019/12/30 18:57:32.440269 [INFO] serf: EventMemberJoin: Node 7c4f133d-d0a2-f0fd-2219-ceafde657a5c.dc1 127.0.0.1
TestAgent_loadChecks_token - 2019/12/30 18:57:32.444196 [INFO] serf: EventMemberJoin: Node 7c4f133d-d0a2-f0fd-2219-ceafde657a5c 127.0.0.1
TestAgent_loadChecks_token - 2019/12/30 18:57:32.445048 [INFO] consul: Adding LAN server Node 7c4f133d-d0a2-f0fd-2219-ceafde657a5c (Addr: tcp/127.0.0.1:18964) (DC: dc1)
TestAgent_loadChecks_token - 2019/12/30 18:57:32.445299 [INFO] consul: Handled member-join event for server "Node 7c4f133d-d0a2-f0fd-2219-ceafde657a5c.dc1" in area "wan"
TestAgent_loadChecks_token - 2019/12/30 18:57:32.446221 [INFO] agent: Started DNS server 127.0.0.1:18959 (udp)
TestAgent_loadChecks_token - 2019/12/30 18:57:32.446306 [INFO] agent: Started DNS server 127.0.0.1:18959 (tcp)
TestAgent_loadChecks_token - 2019/12/30 18:57:32.448962 [INFO] agent: Started HTTP server on 127.0.0.1:18960 (tcp)
TestAgent_loadChecks_token - 2019/12/30 18:57:32.449074 [INFO] agent: started state syncer
2019/12/30 18:57:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:32 [INFO]  raft: Node at 127.0.0.1:18964 [Candidate] entering Candidate state in term 2
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.573295 [WARN] agent: proxy definition "redis-proxy" was overwritten by a proxy definition within a config file
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.574302 [DEBUG] agent: check "service:redis-proxy" exists, not restoring from "/tmp/consul-test/TestAgent_PurgeProxyOnDuplicate-agent152631588/checks/3ed6dc1c512635ec3e32273c592ab387"
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.576100 [INFO] agent: Started DNS server 127.0.0.1:17502 (udp)
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.576469 [INFO] agent: Started DNS server 127.0.0.1:17502 (tcp)
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.577142 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.577407 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.577624 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.577803 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.578090 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.578281 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.578453 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.578628 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.578876 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.578956 [INFO] agent: Started HTTP server on 127.0.0.1:17503 (tcp)
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.579020 [INFO] agent: started state syncer
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.579075 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.579287 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.576471 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.580856 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.581009 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.581146 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.581515 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.581709 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.581857 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582000 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582190 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582349 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582419 [INFO] agent: Requesting shutdown
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582490 [INFO] consul: shutting down client
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582532 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582580 [ERR] roots watch error: invalid type for roots response: <nil>
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582775 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582826 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582859 [INFO] manager: shutting down
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.582498 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.583650 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.661253 [INFO] agent: consul client down
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.661348 [INFO] agent: shutdown complete
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.661415 [INFO] agent: Stopping DNS server 127.0.0.1:17502 (tcp)
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.661637 [INFO] agent: Stopping DNS server 127.0.0.1:17502 (udp)
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.661823 [INFO] agent: Stopping HTTP server 127.0.0.1:17503 (tcp)
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.662053 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:32.662129 [INFO] agent: Endpoints down
--- PASS: TestAgent_PurgeProxyOnDuplicate (1.22s)
=== CONT  TestAgent_PurgeServiceOnDuplicate
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.878126 [WARN] agent: Node name "Node 086d5e64-399b-81bd-3ce7-7aad5c59b84f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.878623 [DEBUG] tlsutil: Update with version 1
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.882946 [INFO] serf: EventMemberJoin: Node 086d5e64-399b-81bd-3ce7-7aad5c59b84f 127.0.0.1
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.884309 [INFO] agent: Started DNS server 127.0.0.1:17514 (udp)
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.885742 [INFO] agent: Started DNS server 127.0.0.1:17514 (tcp)
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.890593 [INFO] agent: Started HTTP server on 127.0.0.1:17515 (tcp)
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.891025 [INFO] agent: started state syncer
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.891224 [WARN] manager: No servers available
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:32.891295 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.079648 [INFO] agent: Requesting shutdown
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.079821 [INFO] consul: shutting down client
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.079883 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.080290 [INFO] manager: shutting down
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.152895 [INFO] agent: consul client down
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.153000 [INFO] agent: shutdown complete
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.153080 [INFO] agent: Stopping DNS server 127.0.0.1:17514 (tcp)
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.153283 [INFO] agent: Stopping DNS server 127.0.0.1:17514 (udp)
2019/12/30 18:57:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:164f5643-279d-c6b7-679e-e79dc4c4c6e9 Address:127.0.0.1:18994}]
2019/12/30 18:57:33 [INFO]  raft: Node at 127.0.0.1:18994 [Follower] entering Follower state (Leader: "")
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.153481 [INFO] agent: Stopping HTTP server 127.0.0.1:17515 (tcp)
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.154104 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeServiceOnDuplicate - 2019/12/30 18:57:33.154183 [INFO] agent: Endpoints down
TestAgent_PersistProxy - 2019/12/30 18:57:33.156155 [INFO] agent: Requesting shutdown
TestAgent_PersistProxy - 2019/12/30 18:57:33.156270 [INFO] consul: shutting down client
TestAgent_PersistProxy - 2019/12/30 18:57:33.156315 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeProxy - 2019/12/30 18:57:33.156861 [INFO] serf: EventMemberJoin: Node 164f5643-279d-c6b7-679e-e79dc4c4c6e9.dc1 127.0.0.1
TestAgent_PersistProxy - 2019/12/30 18:57:33.156692 [INFO] manager: shutting down
TestAgent_PurgeProxy - 2019/12/30 18:57:33.162019 [INFO] serf: EventMemberJoin: Node 164f5643-279d-c6b7-679e-e79dc4c4c6e9 127.0.0.1
TestAgent_PurgeProxy - 2019/12/30 18:57:33.163077 [INFO] consul: Adding LAN server Node 164f5643-279d-c6b7-679e-e79dc4c4c6e9 (Addr: tcp/127.0.0.1:18994) (DC: dc1)
TestAgent_PurgeProxy - 2019/12/30 18:57:33.163323 [INFO] consul: Handled member-join event for server "Node 164f5643-279d-c6b7-679e-e79dc4c4c6e9.dc1" in area "wan"
TestAgent_PurgeProxy - 2019/12/30 18:57:33.164521 [INFO] agent: Started DNS server 127.0.0.1:18989 (tcp)
TestAgent_PurgeProxy - 2019/12/30 18:57:33.165318 [INFO] agent: Started DNS server 127.0.0.1:18989 (udp)
TestAgent_PurgeProxy - 2019/12/30 18:57:33.167687 [INFO] agent: Started HTTP server on 127.0.0.1:18990 (tcp)
TestAgent_PurgeProxy - 2019/12/30 18:57:33.167798 [INFO] agent: started state syncer
2019/12/30 18:57:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:33 [INFO]  raft: Node at 127.0.0.1:18994 [Candidate] entering Candidate state in term 2
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.224323 [WARN] agent: Node name "Node 17dd94e9-501d-9442-9721-001e28ccfebd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.225017 [DEBUG] tlsutil: Update with version 1
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.228740 [INFO] serf: EventMemberJoin: Node 17dd94e9-501d-9442-9721-001e28ccfebd 127.0.0.1
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.229194 [INFO] serf: Attempting re-join to previously known node: Node 086d5e64-399b-81bd-3ce7-7aad5c59b84f: 127.0.0.1:17517
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.230210 [DEBUG] memberlist: Failed to join 127.0.0.1: dial tcp 127.0.0.1:17517: connect: connection refused
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.230280 [WARN] serf: Failed to re-join any previously known node
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.232135 [DEBUG] agent: service "redis" exists, not restoring from "/tmp/consul-test/TestAgent_PurgeServiceOnDuplicate-agent440641757/services/86a1b907d54bf7010394bf316e183e67"
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.234055 [INFO] agent: Started DNS server 127.0.0.1:17520 (tcp)
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.234356 [INFO] agent: Started DNS server 127.0.0.1:17520 (udp)
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.237075 [INFO] agent: Started HTTP server on 127.0.0.1:17521 (tcp)
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.237229 [INFO] agent: started state syncer
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.237374 [WARN] manager: No servers available
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.237445 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.240093 [INFO] agent: Requesting shutdown
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.240192 [INFO] consul: shutting down client
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.240239 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.240399 [INFO] manager: shutting down
TestAgent_PersistProxy - 2019/12/30 18:57:33.269716 [INFO] agent: consul client down
TestAgent_PersistProxy - 2019/12/30 18:57:33.269815 [INFO] agent: shutdown complete
TestAgent_PersistProxy - 2019/12/30 18:57:33.269885 [INFO] agent: Stopping DNS server 127.0.0.1:17508 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.270125 [INFO] agent: Stopping DNS server 127.0.0.1:17508 (udp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.270361 [INFO] agent: Stopping HTTP server 127.0.0.1:17509 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.270630 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PersistProxy - 2019/12/30 18:57:33.270726 [INFO] agent: Endpoints down
TestAgent_PersistProxy - 2019/12/30 18:57:33.340235 [WARN] agent: Node name "Node ad2067c9-7c11-fd75-0b1d-3a055045f48a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PersistProxy - 2019/12/30 18:57:33.340928 [DEBUG] tlsutil: Update with version 1
TestAgent_PersistProxy - 2019/12/30 18:57:33.346325 [INFO] serf: EventMemberJoin: Node ad2067c9-7c11-fd75-0b1d-3a055045f48a 127.0.0.1
TestAgent_PersistProxy - 2019/12/30 18:57:33.347715 [INFO] serf: Attempting re-join to previously known node: Node 56945cee-6dd0-189e-a4ba-8a248c4cbae3: 127.0.0.1:17511
TestAgent_PersistProxy - 2019/12/30 18:57:33.348735 [DEBUG] memberlist: Failed to join 127.0.0.1: dial tcp 127.0.0.1:17511: connect: connection refused
TestAgent_PersistProxy - 2019/12/30 18:57:33.348810 [WARN] serf: Failed to re-join any previously known node
TestAgent_PersistProxy - 2019/12/30 18:57:33.350021 [DEBUG] agent: restored service definition "redis-proxy" from "/tmp/consul-test/TestAgent_PersistProxy-agent773678582/services/6ef0f40bebf8c49e46e72d22ec645f5d"
TestAgent_PersistProxy - 2019/12/30 18:57:33.350601 [DEBUG] agent: restored service definition "redis" from "/tmp/consul-test/TestAgent_PersistProxy-agent773678582/services/86a1b907d54bf7010394bf316e183e67"
TestAgent_PersistProxy - 2019/12/30 18:57:33.351563 [DEBUG] agent: restored proxy definition "redis-proxy"
TestAgent_PersistProxy - 2019/12/30 18:57:33.353773 [DEBUG] agent: check "service:redis-proxy" exists, not restoring from "/tmp/consul-test/TestAgent_PersistProxy-agent773678582/checks/3ed6dc1c512635ec3e32273c592ab387"
TestAgent_PersistProxy - 2019/12/30 18:57:33.356033 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356080 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356312 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356456 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356504 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356607 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356772 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356940 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357161 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357327 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357372 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357478 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357599 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357800 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357617 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.357990 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358171 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358286 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358416 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.356798 [ERR] roots watch error: invalid type for roots response: <nil>
TestAgent_PersistProxy - 2019/12/30 18:57:33.358524 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestAgent_PersistProxy - 2019/12/30 18:57:33.358613 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358442 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358796 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358833 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358971 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.358986 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.359619 [INFO] agent: Started DNS server 127.0.0.1:17526 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.359916 [INFO] agent: Started DNS server 127.0.0.1:17526 (udp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.362731 [INFO] agent: Started HTTP server on 127.0.0.1:17527 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.362863 [INFO] agent: started state syncer
TestAgent_PersistProxy - 2019/12/30 18:57:33.363337 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:33.363411 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PersistProxy - 2019/12/30 18:57:33.365761 [INFO] agent: Requesting shutdown
TestAgent_PersistProxy - 2019/12/30 18:57:33.365892 [INFO] consul: shutting down client
TestAgent_PersistProxy - 2019/12/30 18:57:33.366017 [WARN] serf: Shutdown without a Leave
TestAgent_PersistProxy - 2019/12/30 18:57:33.366534 [INFO] manager: shutting down
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.369564 [INFO] agent: consul client down
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.369656 [INFO] agent: shutdown complete
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.369720 [INFO] agent: Stopping DNS server 127.0.0.1:17520 (tcp)
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.369943 [INFO] agent: Stopping DNS server 127.0.0.1:17520 (udp)
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.370206 [INFO] agent: Stopping HTTP server 127.0.0.1:17521 (tcp)
2019/12/30 18:57:33 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:33 [INFO]  raft: Node at 127.0.0.1:18964 [Leader] entering Leader state
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.370471 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeServiceOnDuplicate-a2 - 2019/12/30 18:57:33.370648 [INFO] agent: Endpoints down
TestAgent_loadChecks_token - 2019/12/30 18:57:33.370774 [INFO] consul: cluster leadership acquired
TestAgent_loadChecks_token - 2019/12/30 18:57:33.371259 [INFO] consul: New leader elected: Node 7c4f133d-d0a2-f0fd-2219-ceafde657a5c
--- PASS: TestAgent_PurgeServiceOnDuplicate (0.68s)
=== CONT  TestAgent_PurgeService
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_PurgeService - 2019/12/30 18:57:33.433498 [WARN] agent: Node name "Node 1b1a8b5d-270d-a79e-2889-1eb27500969f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PurgeService - 2019/12/30 18:57:33.433975 [DEBUG] tlsutil: Update with version 1
TestAgent_PurgeService - 2019/12/30 18:57:33.436408 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_PersistProxy - 2019/12/30 18:57:33.452778 [INFO] agent: consul client down
TestAgent_PersistProxy - 2019/12/30 18:57:33.452890 [INFO] agent: shutdown complete
TestAgent_PersistProxy - 2019/12/30 18:57:33.452973 [INFO] agent: Stopping DNS server 127.0.0.1:17526 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.453190 [INFO] agent: Stopping DNS server 127.0.0.1:17526 (udp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.453392 [INFO] agent: Stopping HTTP server 127.0.0.1:17527 (tcp)
TestAgent_PersistProxy - 2019/12/30 18:57:33.453644 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PersistProxy - 2019/12/30 18:57:33.453777 [INFO] agent: Endpoints down
=== CONT  TestAgent_persistedService_compat
--- PASS: TestAgent_PersistProxy (1.32s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_persistedService_compat - 2019/12/30 18:57:33.540360 [WARN] agent: Node name "Node b6218bff-3840-427d-224e-0af870e363bd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_persistedService_compat - 2019/12/30 18:57:33.541096 [DEBUG] tlsutil: Update with version 1
TestAgent_persistedService_compat - 2019/12/30 18:57:33.544071 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_loadChecks_token - 2019/12/30 18:57:33.907774 [INFO] agent: Synced check "rabbitmq"
TestAgent_loadChecks_token - 2019/12/30 18:57:33.907864 [DEBUG] agent: Node info in sync
TestAgent_loadChecks_token - 2019/12/30 18:57:33.908038 [INFO] agent: Requesting shutdown
TestAgent_loadChecks_token - 2019/12/30 18:57:33.908115 [INFO] consul: shutting down server
TestAgent_loadChecks_token - 2019/12/30 18:57:33.908160 [WARN] serf: Shutdown without a Leave
TestAgent_loadChecks_token - 2019/12/30 18:57:33.994777 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:33 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:33 [INFO]  raft: Node at 127.0.0.1:18994 [Leader] entering Leader state
TestAgent_PurgeProxy - 2019/12/30 18:57:33.997749 [INFO] consul: cluster leadership acquired
TestAgent_PurgeProxy - 2019/12/30 18:57:33.998169 [INFO] consul: New leader elected: Node 164f5643-279d-c6b7-679e-e79dc4c4c6e9
TestAgent_loadChecks_token - 2019/12/30 18:57:34.119791 [INFO] manager: shutting down
TestAgent_loadChecks_token - 2019/12/30 18:57:34.120250 [INFO] agent: consul server down
TestAgent_loadChecks_token - 2019/12/30 18:57:34.120320 [INFO] agent: shutdown complete
TestAgent_loadChecks_token - 2019/12/30 18:57:34.120391 [INFO] agent: Stopping DNS server 127.0.0.1:18959 (tcp)
TestAgent_loadChecks_token - 2019/12/30 18:57:34.120570 [INFO] agent: Stopping DNS server 127.0.0.1:18959 (udp)
TestAgent_loadChecks_token - 2019/12/30 18:57:34.120848 [INFO] agent: Stopping HTTP server 127.0.0.1:18960 (tcp)
TestAgent_loadChecks_token - 2019/12/30 18:57:34.121153 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadChecks_token - 2019/12/30 18:57:34.121245 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadChecks_token (3.04s)
=== CONT  TestAgent_PersistService
TestAgent_PurgeProxy - 2019/12/30 18:57:34.225599 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestAgent_PersistService - 2019/12/30 18:57:34.257831 [WARN] agent: Node name "Node 7c804835-f75e-5b96-76cf-8f0666462101" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PersistService - 2019/12/30 18:57:34.258481 [DEBUG] tlsutil: Update with version 1
TestAgent_PersistService - 2019/12/30 18:57:34.268519 [INFO] serf: EventMemberJoin: Node 7c804835-f75e-5b96-76cf-8f0666462101 127.0.0.1
TestAgent_PersistService - 2019/12/30 18:57:34.269767 [INFO] agent: Started DNS server 127.0.0.1:17544 (udp)
TestAgent_PersistService - 2019/12/30 18:57:34.269856 [INFO] agent: Started DNS server 127.0.0.1:17544 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.272248 [INFO] agent: Started HTTP server on 127.0.0.1:17545 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.272354 [INFO] agent: started state syncer
TestAgent_PersistService - 2019/12/30 18:57:34.272758 [WARN] manager: No servers available
TestAgent_PersistService - 2019/12/30 18:57:34.272829 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:34.341069 [WARN] manager: No servers available
jones - 2019/12/30 18:57:34.343263 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d.dc1 (Addr: tcp/127.0.0.1:17734) (DC: dc1)
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:34.353225 [WARN] manager: No servers available
jones - 2019/12/30 18:57:34.491630 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6.dc1 (Addr: tcp/127.0.0.1:17704) (DC: dc1)
TestAgent_PersistService - 2019/12/30 18:57:34.499350 [INFO] agent: Requesting shutdown
TestAgent_PersistService - 2019/12/30 18:57:34.499511 [INFO] consul: shutting down client
TestAgent_PersistService - 2019/12/30 18:57:34.499575 [WARN] serf: Shutdown without a Leave
TestAgent_PersistService - 2019/12/30 18:57:34.499896 [INFO] manager: shutting down
TestAgent_PersistService - 2019/12/30 18:57:34.579540 [INFO] agent: consul client down
TestAgent_PersistService - 2019/12/30 18:57:34.579633 [INFO] agent: shutdown complete
TestAgent_PersistService - 2019/12/30 18:57:34.579697 [INFO] agent: Stopping DNS server 127.0.0.1:17544 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.579915 [INFO] agent: Stopping DNS server 127.0.0.1:17544 (udp)
TestAgent_PersistService - 2019/12/30 18:57:34.580108 [INFO] agent: Stopping HTTP server 127.0.0.1:17545 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.580328 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PersistService - 2019/12/30 18:57:34.580407 [INFO] agent: Endpoints down
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:34.585962 [WARN] manager: No servers available
TestAgent_PersistService - 2019/12/30 18:57:34.649765 [WARN] agent: Node name "Node a2913d22-7a35-d9f9-5dc5-ec62ab19abf6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_PersistService - 2019/12/30 18:57:34.650597 [DEBUG] tlsutil: Update with version 1
TestAgent_PersistService - 2019/12/30 18:57:34.655561 [INFO] serf: EventMemberJoin: Node a2913d22-7a35-d9f9-5dc5-ec62ab19abf6 127.0.0.1
TestAgent_PersistService - 2019/12/30 18:57:34.656587 [INFO] serf: Attempting re-join to previously known node: Node 7c804835-f75e-5b96-76cf-8f0666462101: 127.0.0.1:17547
TestAgent_PersistService - 2019/12/30 18:57:34.657731 [DEBUG] memberlist: Failed to join 127.0.0.1: dial tcp 127.0.0.1:17547: connect: connection refused
TestAgent_PersistService - 2019/12/30 18:57:34.657803 [WARN] serf: Failed to re-join any previously known node
TestAgent_PersistService - 2019/12/30 18:57:34.667620 [DEBUG] agent: restored service definition "redis" from "/tmp/consul-test/TestAgent_PersistService-agent190888330/services/86a1b907d54bf7010394bf316e183e67"
2019/12/30 18:57:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1b1a8b5d-270d-a79e-2889-1eb27500969f Address:127.0.0.1:17537}]
TestAgent_PurgeProxy - 2019/12/30 18:57:34.671282 [INFO] agent: Synced service "redis"
TestAgent_PurgeService - 2019/12/30 18:57:34.674104 [INFO] serf: EventMemberJoin: Node 1b1a8b5d-270d-a79e-2889-1eb27500969f.dc1 127.0.0.1
2019/12/30 18:57:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b6218bff-3840-427d-224e-0af870e363bd Address:127.0.0.1:17543}]
TestAgent_persistedService_compat - 2019/12/30 18:57:34.677663 [INFO] serf: EventMemberJoin: Node b6218bff-3840-427d-224e-0af870e363bd.dc1 127.0.0.1
2019/12/30 18:57:34 [INFO]  raft: Node at 127.0.0.1:17537 [Follower] entering Follower state (Leader: "")
TestAgent_PurgeService - 2019/12/30 18:57:34.677695 [INFO] serf: EventMemberJoin: Node 1b1a8b5d-270d-a79e-2889-1eb27500969f 127.0.0.1
TestAgent_PurgeService - 2019/12/30 18:57:34.683326 [INFO] agent: Started DNS server 127.0.0.1:17532 (udp)
2019/12/30 18:57:34 [INFO]  raft: Node at 127.0.0.1:17543 [Follower] entering Follower state (Leader: "")
TestAgent_persistedService_compat - 2019/12/30 18:57:34.686325 [INFO] serf: EventMemberJoin: Node b6218bff-3840-427d-224e-0af870e363bd 127.0.0.1
TestAgent_persistedService_compat - 2019/12/30 18:57:34.687855 [INFO] agent: Started DNS server 127.0.0.1:17538 (udp)
TestAgent_persistedService_compat - 2019/12/30 18:57:34.688388 [INFO] consul: Adding LAN server Node b6218bff-3840-427d-224e-0af870e363bd (Addr: tcp/127.0.0.1:17543) (DC: dc1)
TestAgent_persistedService_compat - 2019/12/30 18:57:34.688630 [INFO] consul: Handled member-join event for server "Node b6218bff-3840-427d-224e-0af870e363bd.dc1" in area "wan"
TestAgent_PurgeService - 2019/12/30 18:57:34.689699 [INFO] consul: Adding LAN server Node 1b1a8b5d-270d-a79e-2889-1eb27500969f (Addr: tcp/127.0.0.1:17537) (DC: dc1)
TestAgent_PurgeService - 2019/12/30 18:57:34.690111 [INFO] consul: Handled member-join event for server "Node 1b1a8b5d-270d-a79e-2889-1eb27500969f.dc1" in area "wan"
TestAgent_PurgeService - 2019/12/30 18:57:34.690889 [INFO] agent: Started DNS server 127.0.0.1:17532 (tcp)
TestAgent_persistedService_compat - 2019/12/30 18:57:34.692528 [INFO] agent: Started DNS server 127.0.0.1:17538 (tcp)
TestAgent_PurgeService - 2019/12/30 18:57:34.694819 [INFO] agent: Started HTTP server on 127.0.0.1:17533 (tcp)
TestAgent_PurgeService - 2019/12/30 18:57:34.694943 [INFO] agent: started state syncer
TestAgent_persistedService_compat - 2019/12/30 18:57:34.695998 [INFO] agent: Started HTTP server on 127.0.0.1:17539 (tcp)
TestAgent_persistedService_compat - 2019/12/30 18:57:34.696230 [INFO] agent: started state syncer
TestAgent_PersistService - 2019/12/30 18:57:34.704228 [INFO] agent: Started DNS server 127.0.0.1:17550 (udp)
TestAgent_PersistService - 2019/12/30 18:57:34.706012 [INFO] agent: Started DNS server 127.0.0.1:17550 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.713227 [INFO] agent: Started HTTP server on 127.0.0.1:17551 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.713570 [INFO] agent: started state syncer
TestAgent_PersistService - 2019/12/30 18:57:34.713878 [WARN] manager: No servers available
TestAgent_PersistService - 2019/12/30 18:57:34.715508 [ERR] agent: failed to sync remote state: No known Consul servers
TestAgent_PersistService - 2019/12/30 18:57:34.717080 [INFO] agent: Requesting shutdown
TestAgent_PersistService - 2019/12/30 18:57:34.717379 [INFO] consul: shutting down client
TestAgent_PersistService - 2019/12/30 18:57:34.717721 [WARN] serf: Shutdown without a Leave
TestAgent_PersistService - 2019/12/30 18:57:34.717898 [INFO] manager: shutting down
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:34.722816 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:34.731719 [WARN] manager: No servers available
2019/12/30 18:57:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:34 [INFO]  raft: Node at 127.0.0.1:17537 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:34 [INFO]  raft: Node at 127.0.0.1:17543 [Candidate] entering Candidate state in term 2
TestAgent_PersistProxy - 2019/12/30 18:57:34.825427 [WARN] manager: No servers available
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:34.870504 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:34.890440 [WARN] manager: No servers available
TestAgent_PersistService - 2019/12/30 18:57:34.921396 [INFO] agent: consul client down
TestAgent_PersistService - 2019/12/30 18:57:34.921494 [INFO] agent: shutdown complete
TestAgent_PersistService - 2019/12/30 18:57:34.921566 [INFO] agent: Stopping DNS server 127.0.0.1:17550 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.921774 [INFO] agent: Stopping DNS server 127.0.0.1:17550 (udp)
TestAgent_PersistService - 2019/12/30 18:57:34.921983 [INFO] agent: Stopping HTTP server 127.0.0.1:17551 (tcp)
TestAgent_PersistService - 2019/12/30 18:57:34.922207 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PersistService - 2019/12/30 18:57:34.922286 [INFO] agent: Endpoints down
=== CONT  TestAgent_updateTTLCheck
--- PASS: TestAgent_PersistService (0.81s)
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:35.041550 [WARN] manager: No servers available
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_updateTTLCheck - 2019/12/30 18:57:35.084012 [WARN] agent: Node name "Node 5a738e85-9c2e-b7a7-536b-f86c7afc9861" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_updateTTLCheck - 2019/12/30 18:57:35.087291 [DEBUG] tlsutil: Update with version 1
TestAgent_updateTTLCheck - 2019/12/30 18:57:35.091440 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:35.177306 [WARN] manager: No servers available
TestAgent_PurgeProxy - 2019/12/30 18:57:35.207377 [INFO] agent: Synced service "redis-proxy"
TestAgent_PurgeProxy - 2019/12/30 18:57:35.207497 [DEBUG] agent: Check "service:redis-proxy" in sync
TestAgent_PurgeProxy - 2019/12/30 18:57:35.207537 [DEBUG] agent: Node info in sync
TestAgent_PurgeProxyOnDuplicate - 2019/12/30 18:57:35.302799 [WARN] manager: No servers available
2019/12/30 18:57:35 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:35 [INFO]  raft: Node at 127.0.0.1:17537 [Leader] entering Leader state
2019/12/30 18:57:35 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:35 [INFO]  raft: Node at 127.0.0.1:17543 [Leader] entering Leader state
TestAgent_PurgeService - 2019/12/30 18:57:35.421694 [INFO] consul: cluster leadership acquired
TestAgent_PurgeService - 2019/12/30 18:57:35.422130 [INFO] consul: New leader elected: Node 1b1a8b5d-270d-a79e-2889-1eb27500969f
TestAgent_persistedService_compat - 2019/12/30 18:57:35.422382 [INFO] consul: cluster leadership acquired
TestAgent_persistedService_compat - 2019/12/30 18:57:35.422730 [INFO] consul: New leader elected: Node b6218bff-3840-427d-224e-0af870e363bd
TestAgent_PurgeProxy - 2019/12/30 18:57:35.427164 [DEBUG] agent: removed check "service:redis-proxy"
TestAgent_PurgeProxy - 2019/12/30 18:57:35.427255 [DEBUG] agent: removed service "redis-proxy"
TestAgent_PurgeService - 2019/12/30 18:57:35.524626 [DEBUG] agent: removed service "redis"
TestAgent_PersistProxy - 2019/12/30 18:57:35.557024 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:35.570168 [WARN] manager: No servers available
TestAgent_PurgeProxy - 2019/12/30 18:57:35.733974 [DEBUG] agent: removed check "service:redis-proxy"
TestAgent_PurgeProxy - 2019/12/30 18:57:35.734080 [DEBUG] agent: removed service "redis-proxy"
TestAgent_PurgeProxy - 2019/12/30 18:57:35.735187 [INFO] agent: Requesting shutdown
TestAgent_PurgeProxy - 2019/12/30 18:57:35.735312 [INFO] consul: shutting down server
TestAgent_PurgeProxy - 2019/12/30 18:57:35.735507 [WARN] serf: Shutdown without a Leave
TestAgent_PersistProxy - 2019/12/30 18:57:35.768993 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:35.773114 [WARN] manager: No servers available
TestAgent_persistedService_compat - 2019/12/30 18:57:35.837020 [INFO] agent: Synced node info
TestAgent_persistedService_compat - 2019/12/30 18:57:35.837226 [DEBUG] agent: restored service definition "redis" from "/tmp/TestAgent_persistedService_compat-agent120697111/services/86a1b907d54bf7010394bf316e183e67"
TestAgent_persistedService_compat - 2019/12/30 18:57:35.837494 [INFO] agent: Requesting shutdown
TestAgent_PurgeService - 2019/12/30 18:57:35.837519 [INFO] agent: Synced service "redis"
TestAgent_persistedService_compat - 2019/12/30 18:57:35.837557 [INFO] consul: shutting down server
TestAgent_persistedService_compat - 2019/12/30 18:57:35.837603 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeService - 2019/12/30 18:57:35.837563 [DEBUG] agent: Node info in sync
TestAgent_PurgeService - 2019/12/30 18:57:35.840355 [DEBUG] agent: Service "redis" in sync
TestAgent_PurgeService - 2019/12/30 18:57:35.840413 [DEBUG] agent: Node info in sync
TestAgent_PurgeService - 2019/12/30 18:57:35.841845 [DEBUG] agent: removed service "redis"
TestAgent_PurgeService - 2019/12/30 18:57:35.841988 [INFO] agent: Requesting shutdown
TestAgent_PurgeService - 2019/12/30 18:57:35.842050 [INFO] consul: shutting down server
TestAgent_PurgeService - 2019/12/30 18:57:35.842097 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeProxy - 2019/12/30 18:57:35.843268 [WARN] serf: Shutdown without a Leave
TestAgent_PersistProxy - 2019/12/30 18:57:35.864514 [WARN] manager: No servers available
TestAgent_PurgeProxy - 2019/12/30 18:57:35.926243 [DEBUG] agent: Service "redis" in sync
TestAgent_PurgeService - 2019/12/30 18:57:35.961255 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeProxy - 2019/12/30 18:57:35.961331 [INFO] manager: shutting down
TestAgent_persistedService_compat - 2019/12/30 18:57:35.961255 [WARN] serf: Shutdown without a Leave
TestAgent_PurgeProxy - 2019/12/30 18:57:35.961491 [WARN] agent: Deregistering service "redis-proxy" failed. raft is already shutdown
TestAgent_PurgeProxy - 2019/12/30 18:57:35.961539 [ERR] agent: failed to sync remote state: raft is already shutdown
TestAgent_PurgeProxyOnDuplicate-a2 - 2019/12/30 18:57:35.964925 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:35.977375 [WARN] manager: No servers available
TestAgent_persistedService_compat - 2019/12/30 18:57:36.061491 [INFO] manager: shutting down
TestAgent_PurgeService - 2019/12/30 18:57:36.062535 [INFO] manager: shutting down
TestAgent_PurgeProxy - 2019/12/30 18:57:36.064372 [INFO] agent: consul server down
TestAgent_PurgeProxy - 2019/12/30 18:57:36.064918 [INFO] agent: shutdown complete
TestAgent_PurgeProxy - 2019/12/30 18:57:36.065172 [INFO] agent: Stopping DNS server 127.0.0.1:18989 (tcp)
TestAgent_PurgeProxy - 2019/12/30 18:57:36.066067 [INFO] agent: Stopping DNS server 127.0.0.1:18989 (udp)
TestAgent_PurgeService - 2019/12/30 18:57:36.065670 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_PurgeService - 2019/12/30 18:57:36.066071 [INFO] agent: consul server down
TestAgent_PurgeService - 2019/12/30 18:57:36.066410 [INFO] agent: shutdown complete
TestAgent_PurgeService - 2019/12/30 18:57:36.066463 [INFO] agent: Stopping DNS server 127.0.0.1:17532 (tcp)
TestAgent_persistedService_compat - 2019/12/30 18:57:36.066577 [INFO] agent: consul server down
TestAgent_persistedService_compat - 2019/12/30 18:57:36.066631 [INFO] agent: shutdown complete
TestAgent_persistedService_compat - 2019/12/30 18:57:36.066687 [INFO] agent: Stopping DNS server 127.0.0.1:17538 (tcp)
TestAgent_persistedService_compat - 2019/12/30 18:57:36.066746 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_persistedService_compat - 2019/12/30 18:57:36.066828 [INFO] agent: Stopping DNS server 127.0.0.1:17538 (udp)
TestAgent_PurgeProxy - 2019/12/30 18:57:36.066933 [INFO] agent: Stopping HTTP server 127.0.0.1:18990 (tcp)
TestAgent_PurgeProxy - 2019/12/30 18:57:36.066184 [ERR] connect: Apply failed leadership lost while committing log
TestAgent_PurgeProxy - 2019/12/30 18:57:36.067154 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_persistedService_compat - 2019/12/30 18:57:36.066984 [INFO] agent: Stopping HTTP server 127.0.0.1:17539 (tcp)
TestAgent_persistedService_compat - 2019/12/30 18:57:36.067901 [INFO] agent: Waiting for endpoints to shut down
TestAgent_persistedService_compat - 2019/12/30 18:57:36.068017 [INFO] agent: Endpoints down
--- PASS: TestAgent_persistedService_compat (2.61s)
=== CONT  TestAgent_HTTPCheck_TLSSkipVerify
TestAgent_PurgeService - 2019/12/30 18:57:36.066599 [INFO] agent: Stopping DNS server 127.0.0.1:17532 (udp)
TestAgent_PurgeProxy - 2019/12/30 18:57:36.074935 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeService - 2019/12/30 18:57:36.075156 [INFO] agent: Stopping HTTP server 127.0.0.1:17533 (tcp)
TestAgent_PurgeService - 2019/12/30 18:57:36.075484 [INFO] agent: Waiting for endpoints to shut down
TestAgent_PurgeService - 2019/12/30 18:57:36.075872 [INFO] agent: Endpoints down
--- PASS: TestAgent_PurgeService (2.70s)
TestAgent_PurgeProxy - 2019/12/30 18:57:36.075873 [INFO] agent: Endpoints down
=== CONT  TestAgent_RemoveCheck
TestAgent_persistedService_compat - 2019/12/30 18:57:36.067035 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_persistedService_compat - 2019/12/30 18:57:36.076416 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_persistedService_compat - 2019/12/30 18:57:36.076483 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
--- PASS: TestAgent_PurgeProxy (4.33s)
=== CONT  TestAgent_AddCheck_Alias_userAndSetToken
TestAgent_PersistProxy - 2019/12/30 18:57:36.081557 [WARN] manager: No servers available
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:36.190496 [WARN] agent: Node name "Node 83b876b8-be5c-163e-8dfb-b458c2043e24" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:36.190909 [DEBUG] tlsutil: Update with version 1
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:36.193029 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:36.201522 [WARN] agent: Node name "Node 77821a6a-985f-0a77-964c-06c047ac1cee" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:36.201906 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:36.206572 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_PersistProxy - 2019/12/30 18:57:36.220858 [WARN] manager: No servers available
2019/12/30 18:57:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5a738e85-9c2e-b7a7-536b-f86c7afc9861 Address:127.0.0.1:17561}]
2019/12/30 18:57:36 [INFO]  raft: Node at 127.0.0.1:17561 [Follower] entering Follower state (Leader: "")
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.230252 [INFO] serf: EventMemberJoin: Node 5a738e85-9c2e-b7a7-536b-f86c7afc9861.dc1 127.0.0.1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_RemoveCheck - 2019/12/30 18:57:36.235989 [WARN] agent: Node name "Node 6b32404e-5e8f-5c04-f25d-87e6104d09d5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.236104 [INFO] serf: EventMemberJoin: Node 5a738e85-9c2e-b7a7-536b-f86c7afc9861 127.0.0.1
TestAgent_RemoveCheck - 2019/12/30 18:57:36.236539 [DEBUG] tlsutil: Update with version 1
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.236791 [INFO] consul: Handled member-join event for server "Node 5a738e85-9c2e-b7a7-536b-f86c7afc9861.dc1" in area "wan"
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.237061 [INFO] consul: Adding LAN server Node 5a738e85-9c2e-b7a7-536b-f86c7afc9861 (Addr: tcp/127.0.0.1:17561) (DC: dc1)
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.237459 [INFO] agent: Started DNS server 127.0.0.1:17556 (udp)
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.237643 [INFO] agent: Started DNS server 127.0.0.1:17556 (tcp)
TestAgent_RemoveCheck - 2019/12/30 18:57:36.238928 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.240198 [INFO] agent: Started HTTP server on 127.0.0.1:17557 (tcp)
TestAgent_updateTTLCheck - 2019/12/30 18:57:36.240292 [INFO] agent: started state syncer
2019/12/30 18:57:36 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:36 [INFO]  raft: Node at 127.0.0.1:17561 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:57:36.436805 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:36.436924 [DEBUG] agent: Node info in sync
jones - 2019/12/30 18:57:36.467154 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 90e88a15-5862-4de0-2f1f-c638261bac76.dc1 (Addr: tcp/127.0.0.1:17698) (DC: dc1)
TestAgent_PersistProxy - 2019/12/30 18:57:36.481315 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:36.787998 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:36.910004 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:37.023970 [WARN] manager: No servers available
TestAgent_PersistProxy - 2019/12/30 18:57:37.046287 [WARN] manager: No servers available
2019/12/30 18:57:37 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:37 [INFO]  raft: Node at 127.0.0.1:17561 [Leader] entering Leader state
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.246770 [INFO] consul: cluster leadership acquired
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.247205 [INFO] consul: New leader elected: Node 5a738e85-9c2e-b7a7-536b-f86c7afc9861
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.274916 [DEBUG] agent: Check "mem" status is now passing
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.276730 [DEBUG] agent: Check "mem" status is now critical
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.282072 [INFO] agent: Requesting shutdown
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.282312 [INFO] consul: shutting down server
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.282372 [WARN] serf: Shutdown without a Leave
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.282595 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.431140 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6b32404e-5e8f-5c04-f25d-87e6104d09d5 Address:127.0.0.1:17573}]
2019/12/30 18:57:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:77821a6a-985f-0a77-964c-06c047ac1cee Address:127.0.0.1:17579}]
TestAgent_RemoveCheck - 2019/12/30 18:57:37.540225 [INFO] serf: EventMemberJoin: Node 6b32404e-5e8f-5c04-f25d-87e6104d09d5.dc1 127.0.0.1
2019/12/30 18:57:37 [INFO]  raft: Node at 127.0.0.1:17579 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:37 [INFO]  raft: Node at 127.0.0.1:17573 [Follower] entering Follower state (Leader: "")
TestAgent_RemoveCheck - 2019/12/30 18:57:37.543767 [INFO] serf: EventMemberJoin: Node 6b32404e-5e8f-5c04-f25d-87e6104d09d5 127.0.0.1
TestAgent_RemoveCheck - 2019/12/30 18:57:37.545522 [INFO] agent: Started DNS server 127.0.0.1:17568 (udp)
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.548951 [INFO] manager: shutting down
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.555594 [INFO] serf: EventMemberJoin: Node 77821a6a-985f-0a77-964c-06c047ac1cee.dc1 127.0.0.1
TestAgent_RemoveCheck - 2019/12/30 18:57:37.548208 [INFO] consul: Adding LAN server Node 6b32404e-5e8f-5c04-f25d-87e6104d09d5 (Addr: tcp/127.0.0.1:17573) (DC: dc1)
TestAgent_RemoveCheck - 2019/12/30 18:57:37.548364 [INFO] consul: Handled member-join event for server "Node 6b32404e-5e8f-5c04-f25d-87e6104d09d5.dc1" in area "wan"
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.562624 [INFO] serf: EventMemberJoin: Node 77821a6a-985f-0a77-964c-06c047ac1cee 127.0.0.1
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.563376 [INFO] consul: Adding LAN server Node 77821a6a-985f-0a77-964c-06c047ac1cee (Addr: tcp/127.0.0.1:17579) (DC: dc1)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.563867 [INFO] consul: Handled member-join event for server "Node 77821a6a-985f-0a77-964c-06c047ac1cee.dc1" in area "wan"
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.564792 [INFO] agent: Started DNS server 127.0.0.1:17574 (tcp)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.565131 [INFO] agent: Started DNS server 127.0.0.1:17574 (udp)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.568023 [INFO] agent: Started HTTP server on 127.0.0.1:17575 (tcp)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:37.568136 [INFO] agent: started state syncer
TestAgent_RemoveCheck - 2019/12/30 18:57:37.568807 [INFO] agent: Started DNS server 127.0.0.1:17568 (tcp)
TestAgent_RemoveCheck - 2019/12/30 18:57:37.571803 [INFO] agent: Started HTTP server on 127.0.0.1:17569 (tcp)
TestAgent_RemoveCheck - 2019/12/30 18:57:37.571923 [INFO] agent: started state syncer
2019/12/30 18:57:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:37 [INFO]  raft: Node at 127.0.0.1:17573 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:37 [INFO]  raft: Node at 127.0.0.1:17579 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:83b876b8-be5c-163e-8dfb-b458c2043e24 Address:127.0.0.1:17567}]
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.651228 [INFO] serf: EventMemberJoin: Node 83b876b8-be5c-163e-8dfb-b458c2043e24.dc1 127.0.0.1
2019/12/30 18:57:37 [INFO]  raft: Node at 127.0.0.1:17567 [Follower] entering Follower state (Leader: "")
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.676829 [INFO] serf: EventMemberJoin: Node 83b876b8-be5c-163e-8dfb-b458c2043e24 127.0.0.1
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.679363 [INFO] consul: Adding LAN server Node 83b876b8-be5c-163e-8dfb-b458c2043e24 (Addr: tcp/127.0.0.1:17567) (DC: dc1)
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.680224 [INFO] consul: Handled member-join event for server "Node 83b876b8-be5c-163e-8dfb-b458c2043e24.dc1" in area "wan"
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.683335 [INFO] agent: Started DNS server 127.0.0.1:17562 (udp)
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.683778 [INFO] agent: Started DNS server 127.0.0.1:17562 (tcp)
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.686391 [INFO] agent: Started HTTP server on 127.0.0.1:17563 (tcp)
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:37.686500 [INFO] agent: started state syncer
2019/12/30 18:57:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:37 [INFO]  raft: Node at 127.0.0.1:17567 [Candidate] entering Candidate state in term 2
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.878294 [INFO] agent: consul server down
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.878427 [INFO] agent: shutdown complete
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.878500 [INFO] agent: Stopping DNS server 127.0.0.1:17556 (tcp)
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.878685 [INFO] agent: Stopping DNS server 127.0.0.1:17556 (udp)
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.878904 [INFO] agent: Stopping HTTP server 127.0.0.1:17557 (tcp)
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.879163 [INFO] agent: Waiting for endpoints to shut down
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.879253 [INFO] agent: Endpoints down
--- PASS: TestAgent_updateTTLCheck (2.95s)
=== CONT  TestAgent_AddCheck_Alias_userToken
TestAgent_updateTTLCheck - 2019/12/30 18:57:37.888400 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:37.995176 [WARN] agent: Node name "Node 81ec4519-0fb0-9acd-e931-57a9542cc431" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:37.995748 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:37.998279 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:38 [INFO]  raft: Node at 127.0.0.1:17573 [Leader] entering Leader state
TestAgent_RemoveCheck - 2019/12/30 18:57:38.562987 [INFO] consul: cluster leadership acquired
TestAgent_RemoveCheck - 2019/12/30 18:57:38.563465 [INFO] consul: New leader elected: Node 6b32404e-5e8f-5c04-f25d-87e6104d09d5
TestAgent_RemoveCheck - 2019/12/30 18:57:38.704375 [DEBUG] agent: removed check "mem"
TestAgent_RemoveCheck - 2019/12/30 18:57:38.704633 [DEBUG] agent: removed check "mem"
TestAgent_RemoveCheck - 2019/12/30 18:57:38.704696 [INFO] agent: Requesting shutdown
TestAgent_RemoveCheck - 2019/12/30 18:57:38.704744 [INFO] consul: shutting down server
TestAgent_RemoveCheck - 2019/12/30 18:57:38.704787 [WARN] serf: Shutdown without a Leave
TestAgent_RemoveCheck - 2019/12/30 18:57:38.705028 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:57:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:38 [INFO]  raft: Node at 127.0.0.1:17579 [Leader] entering Leader state
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:38.774630 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:38.775075 [INFO] consul: New leader elected: Node 77821a6a-985f-0a77-964c-06c047ac1cee
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:38.798752 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:38.798891 [INFO] consul: shutting down server
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:38.798944 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:38.799337 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_RemoveCheck - 2019/12/30 18:57:38.861281 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:38 [INFO]  raft: Node at 127.0.0.1:17567 [Leader] entering Leader state
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:38.865737 [INFO] consul: cluster leadership acquired
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:38.866152 [INFO] consul: New leader elected: Node 83b876b8-be5c-163e-8dfb-b458c2043e24
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:38.952945 [WARN] serf: Shutdown without a Leave
TestAgent_RemoveCheck - 2019/12/30 18:57:38.956370 [INFO] manager: shutting down
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.044679 [INFO] manager: shutting down
TestAgent_RemoveCheck - 2019/12/30 18:57:39.127962 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128252 [INFO] agent: consul server down
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128302 [INFO] agent: shutdown complete
TestAgent_RemoveCheck - 2019/12/30 18:57:39.128318 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128356 [INFO] agent: Stopping DNS server 127.0.0.1:17574 (tcp)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128477 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128486 [INFO] agent: Stopping DNS server 127.0.0.1:17574 (udp)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128692 [INFO] agent: Stopping HTTP server 127.0.0.1:17575 (tcp)
TestAgent_RemoveCheck - 2019/12/30 18:57:39.128706 [INFO] agent: consul server down
TestAgent_RemoveCheck - 2019/12/30 18:57:39.128746 [INFO] agent: shutdown complete
TestAgent_RemoveCheck - 2019/12/30 18:57:39.128794 [INFO] agent: Stopping DNS server 127.0.0.1:17568 (tcp)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128866 [INFO] agent: Waiting for endpoints to shut down
TestAgent_RemoveCheck - 2019/12/30 18:57:39.128910 [INFO] agent: Stopping DNS server 127.0.0.1:17568 (udp)
TestAgent_AddCheck_Alias_userAndSetToken - 2019/12/30 18:57:39.128925 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_Alias_userAndSetToken (3.05s)
TestAgent_RemoveCheck - 2019/12/30 18:57:39.129042 [INFO] agent: Stopping HTTP server 127.0.0.1:17569 (tcp)
=== CONT  TestAgent_AddCheck_Alias_setToken
TestAgent_RemoveCheck - 2019/12/30 18:57:39.129209 [INFO] agent: Waiting for endpoints to shut down
TestAgent_RemoveCheck - 2019/12/30 18:57:39.129269 [INFO] agent: Endpoints down
--- PASS: TestAgent_RemoveCheck (3.05s)
=== CONT  TestAgent_AddCheck_Alias
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_Alias - 2019/12/30 18:57:39.198566 [WARN] agent: Node name "Node c4017e6a-eadd-1f33-0075-46c363b8a993" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_Alias - 2019/12/30 18:57:39.199156 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_Alias - 2019/12/30 18:57:39.201644 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:39.203688 [INFO] agent: Synced node info
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:39.203887 [WARN] agent: check 'tls' has interval below minimum of 1s
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:39.203949 [DEBUG] tlsutil: OutgoingTLSConfigForCheck with version 1
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:39.205265 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:39.234324 [WARN] agent: Node name "Node c104203f-479d-3cdf-9cae-87b96ee1b04c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:39.235007 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:39.238566 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:81ec4519-0fb0-9acd-e931-57a9542cc431 Address:127.0.0.1:17585}]
2019/12/30 18:57:39 [INFO]  raft: Node at 127.0.0.1:17585 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.306972 [INFO] serf: EventMemberJoin: Node 81ec4519-0fb0-9acd-e931-57a9542cc431.dc1 127.0.0.1
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.310399 [INFO] serf: EventMemberJoin: Node 81ec4519-0fb0-9acd-e931-57a9542cc431 127.0.0.1
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.311616 [INFO] consul: Adding LAN server Node 81ec4519-0fb0-9acd-e931-57a9542cc431 (Addr: tcp/127.0.0.1:17585) (DC: dc1)
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.312045 [INFO] consul: Handled member-join event for server "Node 81ec4519-0fb0-9acd-e931-57a9542cc431.dc1" in area "wan"
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.312189 [INFO] agent: Started DNS server 127.0.0.1:17580 (udp)
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.312737 [INFO] agent: Started DNS server 127.0.0.1:17580 (tcp)
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.315197 [INFO] agent: Started HTTP server on 127.0.0.1:17581 (tcp)
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.315300 [INFO] agent: started state syncer
2019/12/30 18:57:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:39 [INFO]  raft: Node at 127.0.0.1:17585 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:39 [INFO]  raft: Node at 127.0.0.1:17585 [Leader] entering Leader state
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.934985 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:39.935502 [INFO] consul: New leader elected: Node 81ec4519-0fb0-9acd-e931-57a9542cc431
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:40.071283 [DEBUG] agent: Check "tls" is passing
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:40.076588 [INFO] agent: Requesting shutdown
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:40.076757 [INFO] consul: shutting down server
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:40.076814 [WARN] serf: Shutdown without a Leave
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:40.222958 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c104203f-479d-3cdf-9cae-87b96ee1b04c Address:127.0.0.1:17591}]
2019/12/30 18:57:40 [INFO]  raft: Node at 127.0.0.1:17591 [Follower] entering Follower state (Leader: "")
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:40.343544 [INFO] manager: shutting down
2019/12/30 18:57:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c4017e6a-eadd-1f33-0075-46c363b8a993 Address:127.0.0.1:17597}]
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.348699 [INFO] serf: EventMemberJoin: Node c4017e6a-eadd-1f33-0075-46c363b8a993.dc1 127.0.0.1
2019/12/30 18:57:40 [INFO]  raft: Node at 127.0.0.1:17597 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.355496 [INFO] serf: EventMemberJoin: Node c104203f-479d-3cdf-9cae-87b96ee1b04c.dc1 127.0.0.1
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.396877 [INFO] serf: EventMemberJoin: Node c4017e6a-eadd-1f33-0075-46c363b8a993 127.0.0.1
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.397531 [INFO] serf: EventMemberJoin: Node c104203f-479d-3cdf-9cae-87b96ee1b04c 127.0.0.1
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.398174 [INFO] agent: Started DNS server 127.0.0.1:17592 (udp)
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.398663 [INFO] agent: Started DNS server 127.0.0.1:17586 (udp)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.398674 [INFO] consul: Adding LAN server Node c4017e6a-eadd-1f33-0075-46c363b8a993 (Addr: tcp/127.0.0.1:17597) (DC: dc1)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.398849 [INFO] consul: Handled member-join event for server "Node c4017e6a-eadd-1f33-0075-46c363b8a993.dc1" in area "wan"
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.399329 [INFO] consul: Adding LAN server Node c104203f-479d-3cdf-9cae-87b96ee1b04c (Addr: tcp/127.0.0.1:17591) (DC: dc1)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.399602 [INFO] agent: Started DNS server 127.0.0.1:17592 (tcp)
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.399934 [INFO] consul: Handled member-join event for server "Node c104203f-479d-3cdf-9cae-87b96ee1b04c.dc1" in area "wan"
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.400529 [INFO] agent: Started DNS server 127.0.0.1:17586 (tcp)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.402320 [INFO] agent: Started HTTP server on 127.0.0.1:17593 (tcp)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:40.402566 [INFO] agent: started state syncer
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.403405 [INFO] agent: Started HTTP server on 127.0.0.1:17587 (tcp)
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:40.403671 [INFO] agent: started state syncer
2019/12/30 18:57:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:40 [INFO]  raft: Node at 127.0.0.1:17597 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:40 [INFO]  raft: Node at 127.0.0.1:17591 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:40 [ERR] consul.watch: Watch (type: key) errored: Get https://127.0.0.1:18829/v1/kv/asdf: dial tcp 127.0.0.1:18829: connect: connection refused, retry in 45s
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:40.662694 [INFO] agent: Synced node info
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:40.663508 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:40.663621 [INFO] consul: shutting down server
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:40.663736 [WARN] serf: Shutdown without a Leave
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.020271 [INFO] agent: consul server down
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.020436 [INFO] agent: shutdown complete
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.020533 [INFO] agent: Stopping DNS server 127.0.0.1:17562 (tcp)
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.020768 [INFO] agent: Stopping DNS server 127.0.0.1:17562 (udp)
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.021038 [INFO] agent: Stopping HTTP server 127.0.0.1:17563 (tcp)
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.021372 [INFO] agent: Waiting for endpoints to shut down
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.021507 [INFO] agent: Endpoints down
--- PASS: TestAgent_HTTPCheck_TLSSkipVerify (4.95s)
=== CONT  TestAgent_AddCheck_GRPC
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.022278 [ERR] connect: Apply failed leadership lost while committing log
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.022364 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.022706 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.022804 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.022900 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestAgent_HTTPCheck_TLSSkipVerify - 2019/12/30 18:57:41.022996 [ERR] consul: failed to transfer leadership in 3 attempts
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:41.031409 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:41.115040 [WARN] agent: Node name "Node 060dc47b-5c29-f72d-ab44-2a4b8399615d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:41.116178 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:41.119679 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:57:41.457088 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:41.457685 [DEBUG] agent: Node info in sync
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:41.463677 [INFO] manager: shutting down
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:41.463937 [WARN] agent: Syncing check "aliashealth" failed. raft is already shutdown
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:41.464007 [ERR] agent: failed to sync remote state: raft is already shutdown
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.261932 [INFO] agent: consul server down
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.262028 [INFO] agent: shutdown complete
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.262096 [INFO] agent: Stopping DNS server 127.0.0.1:17580 (tcp)
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.262251 [INFO] agent: Stopping DNS server 127.0.0.1:17580 (udp)
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.262436 [INFO] agent: Stopping HTTP server 127.0.0.1:17581 (tcp)
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.262665 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.262741 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_Alias_userToken (4.38s)
=== CONT  TestAgent_AddCheck_ExecRemoteDisable
jones - 2019/12/30 18:57:42.266497 [DEBUG] consul: Skipping self join check for "Node f3b42e12-d0ce-84ea-a863-97fa8b8c0786" since the cluster is too small
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.267084 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.267338 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.267541 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_AddCheck_Alias_userToken - 2019/12/30 18:57:42.267597 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:42.356736 [WARN] agent: Node name "Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:42.357178 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:42.359457 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:57:43.142653 [DEBUG] manager: Rebalanced 1 servers, next active server is Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d.dc1 (Addr: tcp/127.0.0.1:17716) (DC: dc1)
2019/12/30 18:57:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:43 [INFO]  raft: Node at 127.0.0.1:17591 [Leader] entering Leader state
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.384785 [INFO] consul: cluster leadership acquired
2019/12/30 18:57:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:43 [INFO]  raft: Node at 127.0.0.1:17597 [Leader] entering Leader state
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.385296 [INFO] consul: New leader elected: Node c104203f-479d-3cdf-9cae-87b96ee1b04c
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.385513 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.385863 [INFO] consul: New leader elected: Node c4017e6a-eadd-1f33-0075-46c363b8a993
jones - 2019/12/30 18:57:43.387423 [DEBUG] consul: Skipping self join check for "Node 90e88a15-5862-4de0-2f1f-c638261bac76" since the cluster is too small
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.741058 [INFO] agent: Synced node info
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.741782 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.742018 [INFO] consul: shutting down server
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.749294 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:060dc47b-5c29-f72d-ab44-2a4b8399615d Address:127.0.0.1:17603}]
2019/12/30 18:57:43 [INFO]  raft: Node at 127.0.0.1:17603 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.833272 [INFO] serf: EventMemberJoin: Node 060dc47b-5c29-f72d-ab44-2a4b8399615d.dc1 127.0.0.1
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.852334 [INFO] serf: EventMemberJoin: Node 060dc47b-5c29-f72d-ab44-2a4b8399615d 127.0.0.1
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.854679 [INFO] consul: Adding LAN server Node 060dc47b-5c29-f72d-ab44-2a4b8399615d (Addr: tcp/127.0.0.1:17603) (DC: dc1)
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.855518 [INFO] consul: Handled member-join event for server "Node 060dc47b-5c29-f72d-ab44-2a4b8399615d.dc1" in area "wan"
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.858094 [INFO] agent: Started DNS server 127.0.0.1:17598 (tcp)
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.858777 [INFO] agent: Started DNS server 127.0.0.1:17598 (udp)
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.861714 [INFO] agent: Started HTTP server on 127.0.0.1:17599 (tcp)
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:43.861817 [INFO] agent: started state syncer
2019/12/30 18:57:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:43 [INFO]  raft: Node at 127.0.0.1:17603 [Candidate] entering Candidate state in term 2
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.887837 [INFO] agent: Synced node info
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.892722 [DEBUG] agent: Node info in sync
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.892842 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.893471 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.893704 [INFO] consul: shutting down server
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.893758 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_Alias - 2019/12/30 18:57:43.969735 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.969846 [INFO] manager: shutting down
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.970552 [INFO] agent: consul server down
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.970628 [INFO] agent: shutdown complete
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.970693 [INFO] agent: Stopping DNS server 127.0.0.1:17586 (tcp)
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.970630 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.970847 [INFO] agent: Stopping DNS server 127.0.0.1:17586 (udp)
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.970903 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.970964 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.971007 [INFO] agent: Stopping HTTP server 127.0.0.1:17587 (tcp)
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.971221 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_Alias_setToken - 2019/12/30 18:57:43.971306 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_Alias_setToken (4.84s)
=== CONT  TestAgent_AddCheck_ExecDisable
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.120956 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:44.185709 [WARN] agent: Node name "Node 80a38fc6-94d6-dc1e-80ef-88fbdc3702bf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:44.186098 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:44.188398 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.636466 [INFO] agent: consul server down
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.636573 [INFO] agent: shutdown complete
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.636654 [INFO] agent: Stopping DNS server 127.0.0.1:17592 (tcp)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.636840 [INFO] agent: Stopping DNS server 127.0.0.1:17592 (udp)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.637043 [INFO] agent: Stopping HTTP server 127.0.0.1:17593 (tcp)
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.637300 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.637387 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_Alias (5.51s)
=== CONT  TestAgent_AddCheck_RestoreState
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.640438 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_AddCheck_Alias - 2019/12/30 18:57:44.640903 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:44.718180 [WARN] agent: Node name "Node f52c6b5f-9cd9-0f2f-9715-ce543442f0d9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:44.718642 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:44.720981 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:328126e5-27dd-1833-8ea3-e65e5d25fcfd Address:127.0.0.1:17609}]
2019/12/30 18:57:44 [INFO]  raft: Node at 127.0.0.1:17609 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.820595 [INFO] serf: EventMemberJoin: Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd.dc1 127.0.0.1
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.824180 [INFO] serf: EventMemberJoin: Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd 127.0.0.1
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.826314 [INFO] consul: Adding LAN server Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd (Addr: tcp/127.0.0.1:17609) (DC: dc1)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.830457 [INFO] consul: Handled member-join event for server "Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd.dc1" in area "wan"
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.833607 [INFO] agent: Started DNS server 127.0.0.1:17604 (udp)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.834172 [INFO] agent: Started DNS server 127.0.0.1:17604 (tcp)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.836742 [INFO] agent: Started HTTP server on 127.0.0.1:17605 (tcp)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:44.836891 [INFO] agent: started state syncer
2019/12/30 18:57:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:44 [INFO]  raft: Node at 127.0.0.1:17609 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:45 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:45 [INFO]  raft: Node at 127.0.0.1:17603 [Leader] entering Leader state
jones - 2019/12/30 18:57:45.553541 [DEBUG] consul: Skipping self join check for "Node 39b35a7a-e61e-e83c-e3aa-2917305b86d6" since the cluster is too small
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:45.553915 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:45.554371 [INFO] consul: New leader elected: Node 060dc47b-5c29-f72d-ab44-2a4b8399615d
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:45.793582 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:45.793703 [INFO] consul: shutting down server
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:45.793758 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:45.794227 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:45.928034 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.011484 [INFO] manager: shutting down
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.012133 [INFO] agent: consul server down
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.013457 [INFO] agent: shutdown complete
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.013573 [INFO] agent: Stopping DNS server 127.0.0.1:17598 (tcp)
2019/12/30 18:57:46 [INFO]  raft: Election won. Tally: 1
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.013795 [INFO] agent: Stopping DNS server 127.0.0.1:17598 (udp)
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.013966 [INFO] agent: Stopping HTTP server 127.0.0.1:17599 (tcp)
2019/12/30 18:57:46 [INFO]  raft: Node at 127.0.0.1:17609 [Leader] entering Leader state
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.014170 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.014243 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_GRPC (4.99s)
=== CONT  TestAgent_AddCheck_MissingService
TestAgent_AddCheck_GRPC - 2019/12/30 18:57:46.012213 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:46.016984 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:46.017397 [INFO] consul: New leader elected: Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:46.092266 [WARN] agent: Node name "Node 9e184beb-18ab-1bfd-d779-9a04fc529186" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:46.092887 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:46.095633 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:80a38fc6-94d6-dc1e-80ef-88fbdc3702bf Address:127.0.0.1:17615}]
2019/12/30 18:57:46 [INFO]  raft: Node at 127.0.0.1:17615 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.201007 [INFO] serf: EventMemberJoin: Node 80a38fc6-94d6-dc1e-80ef-88fbdc3702bf.dc1 127.0.0.1
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.208736 [INFO] serf: EventMemberJoin: Node 80a38fc6-94d6-dc1e-80ef-88fbdc3702bf 127.0.0.1
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.210350 [INFO] consul: Adding LAN server Node 80a38fc6-94d6-dc1e-80ef-88fbdc3702bf (Addr: tcp/127.0.0.1:17615) (DC: dc1)
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.210845 [INFO] consul: Handled member-join event for server "Node 80a38fc6-94d6-dc1e-80ef-88fbdc3702bf.dc1" in area "wan"
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.213566 [INFO] agent: Started DNS server 127.0.0.1:17610 (tcp)
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.213688 [INFO] agent: Started DNS server 127.0.0.1:17610 (udp)
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.218745 [INFO] agent: Started HTTP server on 127.0.0.1:17611 (tcp)
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:46.218855 [INFO] agent: started state syncer
2019/12/30 18:57:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:46 [INFO]  raft: Node at 127.0.0.1:17615 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f52c6b5f-9cd9-0f2f-9715-ce543442f0d9 Address:127.0.0.1:17621}]
2019/12/30 18:57:46 [INFO]  raft: Node at 127.0.0.1:17621 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.276560 [INFO] serf: EventMemberJoin: Node f52c6b5f-9cd9-0f2f-9715-ce543442f0d9.dc1 127.0.0.1
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.283507 [INFO] serf: EventMemberJoin: Node f52c6b5f-9cd9-0f2f-9715-ce543442f0d9 127.0.0.1
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.292484 [INFO] consul: Handled member-join event for server "Node f52c6b5f-9cd9-0f2f-9715-ce543442f0d9.dc1" in area "wan"
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.293607 [INFO] agent: Started DNS server 127.0.0.1:17616 (tcp)
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.294284 [INFO] agent: Started DNS server 127.0.0.1:17616 (udp)
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.294905 [INFO] consul: Adding LAN server Node f52c6b5f-9cd9-0f2f-9715-ce543442f0d9 (Addr: tcp/127.0.0.1:17621) (DC: dc1)
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.296660 [INFO] agent: Started HTTP server on 127.0.0.1:17617 (tcp)
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:46.296758 [INFO] agent: started state syncer
2019/12/30 18:57:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:46 [INFO]  raft: Node at 127.0.0.1:17621 [Candidate] entering Candidate state in term 2
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:46.470727 [INFO] agent: Synced node info
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:46.470859 [DEBUG] agent: Node info in sync
2019/12/30 18:57:47 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:47 [INFO]  raft: Node at 127.0.0.1:17615 [Leader] entering Leader state
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.312718 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.313300 [INFO] consul: New leader elected: Node 80a38fc6-94d6-dc1e-80ef-88fbdc3702bf
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.371815 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.371908 [INFO] consul: shutting down server
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.371960 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.372317 [ERR] agent: failed to sync remote state: No cluster leader
jones - 2019/12/30 18:57:47.678993 [DEBUG] manager: Rebalanced 1 servers, next active server is Node f632792c-c81a-fbfb-b7c4-e99bdb454ade.dc1 (Addr: tcp/127.0.0.1:17740) (DC: dc1)
jones - 2019/12/30 18:57:47.772228 [DEBUG] consul: Skipping self join check for "Node 21fec6d0-1e2c-94e5-aea5-373e1f263aef" since the cluster is too small
2019/12/30 18:57:47 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:47 [INFO]  raft: Node at 127.0.0.1:17621 [Leader] entering Leader state
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:47.862768 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:47.863252 [INFO] consul: New leader elected: Node f52c6b5f-9cd9-0f2f-9715-ce543442f0d9
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.865446 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9e184beb-18ab-1bfd-d779-9a04fc529186 Address:127.0.0.1:17627}]
2019/12/30 18:57:47 [INFO]  raft: Node at 127.0.0.1:17627 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.949043 [INFO] serf: EventMemberJoin: Node 9e184beb-18ab-1bfd-d779-9a04fc529186.dc1 127.0.0.1
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:47.949864 [INFO] manager: shutting down
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.954257 [INFO] serf: EventMemberJoin: Node 9e184beb-18ab-1bfd-d779-9a04fc529186 127.0.0.1
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.955819 [INFO] consul: Adding LAN server Node 9e184beb-18ab-1bfd-d779-9a04fc529186 (Addr: tcp/127.0.0.1:17627) (DC: dc1)
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.956913 [INFO] consul: Handled member-join event for server "Node 9e184beb-18ab-1bfd-d779-9a04fc529186.dc1" in area "wan"
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.959594 [INFO] agent: Started DNS server 127.0.0.1:17622 (tcp)
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.959697 [INFO] agent: Started DNS server 127.0.0.1:17622 (udp)
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.962560 [INFO] agent: Started HTTP server on 127.0.0.1:17623 (tcp)
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:47.962893 [INFO] agent: started state syncer
2019/12/30 18:57:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:47 [INFO]  raft: Node at 127.0.0.1:17627 [Candidate] entering Candidate state in term 2
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.137037 [INFO] agent: consul server down
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.137115 [INFO] agent: shutdown complete
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.137173 [INFO] agent: Stopping DNS server 127.0.0.1:17610 (tcp)
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.137340 [INFO] agent: Stopping DNS server 127.0.0.1:17610 (udp)
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.137532 [INFO] agent: Stopping HTTP server 127.0.0.1:17611 (tcp)
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.137770 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.137890 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_ExecDisable (4.17s)
=== CONT  TestAgent_AddCheck_MinInterval
TestAgent_AddCheck_ExecDisable - 2019/12/30 18:57:48.150305 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.245660 [INFO] agent: Synced node info
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.246264 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.246342 [INFO] consul: shutting down server
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.246413 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:48.252265 [WARN] agent: Node name "Node ff09fd96-f40f-20ab-50e8-07ee16fe3d49" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:48.252869 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:48.255386 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.339191 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.439154 [INFO] manager: shutting down
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.736707 [INFO] agent: consul server down
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.736778 [INFO] agent: shutdown complete
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.736835 [INFO] agent: Stopping DNS server 127.0.0.1:17616 (tcp)
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.736976 [INFO] agent: Stopping DNS server 127.0.0.1:17616 (udp)
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.737135 [INFO] agent: Stopping HTTP server 127.0.0.1:17617 (tcp)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:48.737170 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.737333 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.737404 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_RestoreState (4.10s)
=== CONT  TestAgent_AddCheck_StartPassing
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:48.737599 [DEBUG] consul: Skipping self join check for "Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd" since the cluster is too small
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:48.737747 [INFO] consul: member 'Node 328126e5-27dd-1833-8ea3-e65e5d25fcfd' joined, marking health alive
2019/12/30 18:57:48 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:48 [INFO]  raft: Node at 127.0.0.1:17627 [Leader] entering Leader state
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.746478 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_AddCheck_RestoreState - 2019/12/30 18:57:48.746761 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:48.747385 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:48.747751 [INFO] consul: New leader elected: Node 9e184beb-18ab-1bfd-d779-9a04fc529186
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:48.814130 [WARN] agent: Node name "Node d177d3be-99cd-4f1a-c5d7-b298a8bd6c8d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:48.814668 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:48.817220 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:48.830873 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:48.830967 [INFO] consul: shutting down server
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:48.831015 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:48.831399 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:48.939151 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.054609 [INFO] manager: shutting down
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.067083 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.067182 [INFO] consul: shutting down server
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.067235 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.145602 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.261709 [INFO] agent: consul server down
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.261780 [INFO] agent: shutdown complete
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.261840 [INFO] agent: Stopping DNS server 127.0.0.1:17622 (tcp)
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.261967 [INFO] agent: Stopping DNS server 127.0.0.1:17622 (udp)
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.262114 [INFO] agent: Stopping HTTP server 127.0.0.1:17623 (tcp)
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.262208 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.262303 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_MissingService - 2019/12/30 18:57:49.262356 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_MissingService (3.25s)
=== CONT  TestAgent_AddCheck
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.266174 [INFO] manager: shutting down
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.266595 [INFO] agent: consul server down
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.266649 [INFO] agent: shutdown complete
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.266706 [INFO] agent: Stopping DNS server 127.0.0.1:17604 (tcp)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.266840 [INFO] agent: Stopping DNS server 127.0.0.1:17604 (udp)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.266986 [INFO] agent: Stopping HTTP server 127.0.0.1:17605 (tcp)
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.267173 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_ExecRemoteDisable - 2019/12/30 18:57:49.267252 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_ExecRemoteDisable (7.00s)
=== CONT  TestAgent_IndexChurn
=== CONT  TestAgent_RemoveServiceRemovesAllChecks
--- PASS: TestAgent_IndexChurn (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:49.560633 [DEBUG] tlsutil: Update with version 1
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:49.563602 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:57:49.626008 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 5122c9d8-8979-c841-956f-094a90e62880.dc1 (Addr: tcp/127.0.0.1:17722) (DC: dc1)
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddCheck - 2019/12/30 18:57:49.693771 [WARN] agent: Node name "Node 2068fad4-8c7a-1cc7-4f41-a9720918d5c1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_AddCheck - 2019/12/30 18:57:49.694326 [DEBUG] tlsutil: Update with version 1
TestAgent_AddCheck - 2019/12/30 18:57:49.697399 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ff09fd96-f40f-20ab-50e8-07ee16fe3d49 Address:127.0.0.1:17633}]
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.770785 [INFO] serf: EventMemberJoin: Node ff09fd96-f40f-20ab-50e8-07ee16fe3d49.dc1 127.0.0.1
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.773808 [INFO] serf: EventMemberJoin: Node ff09fd96-f40f-20ab-50e8-07ee16fe3d49 127.0.0.1
2019/12/30 18:57:49 [INFO]  raft: Node at 127.0.0.1:17633 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.779290 [INFO] consul: Adding LAN server Node ff09fd96-f40f-20ab-50e8-07ee16fe3d49 (Addr: tcp/127.0.0.1:17633) (DC: dc1)
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.779895 [INFO] consul: Handled member-join event for server "Node ff09fd96-f40f-20ab-50e8-07ee16fe3d49.dc1" in area "wan"
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.780167 [INFO] agent: Started DNS server 127.0.0.1:17628 (tcp)
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.784778 [INFO] agent: Started DNS server 127.0.0.1:17628 (udp)
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.787252 [INFO] agent: Started HTTP server on 127.0.0.1:17629 (tcp)
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:49.787349 [INFO] agent: started state syncer
2019/12/30 18:57:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:49 [INFO]  raft: Node at 127.0.0.1:17633 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:57:49.887674 [DEBUG] consul: Skipping self join check for "Node b8c654fc-da2b-ce2a-7f10-53f103af6c8d" since the cluster is too small
2019/12/30 18:57:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d177d3be-99cd-4f1a-c5d7-b298a8bd6c8d Address:127.0.0.1:17639}]
2019/12/30 18:57:50 [INFO]  raft: Node at 127.0.0.1:17639 [Follower] entering Follower state (Leader: "")
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.242405 [INFO] serf: EventMemberJoin: Node d177d3be-99cd-4f1a-c5d7-b298a8bd6c8d.dc1 127.0.0.1
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.247169 [INFO] serf: EventMemberJoin: Node d177d3be-99cd-4f1a-c5d7-b298a8bd6c8d 127.0.0.1
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.247996 [INFO] consul: Handled member-join event for server "Node d177d3be-99cd-4f1a-c5d7-b298a8bd6c8d.dc1" in area "wan"
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.248076 [INFO] consul: Adding LAN server Node d177d3be-99cd-4f1a-c5d7-b298a8bd6c8d (Addr: tcp/127.0.0.1:17639) (DC: dc1)
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.248419 [INFO] agent: Started DNS server 127.0.0.1:17634 (udp)
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.248495 [INFO] agent: Started DNS server 127.0.0.1:17634 (tcp)
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.250798 [INFO] agent: Started HTTP server on 127.0.0.1:17635 (tcp)
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:50.250897 [INFO] agent: started state syncer
2019/12/30 18:57:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:50 [INFO]  raft: Node at 127.0.0.1:17639 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:50 [INFO]  raft: Node at 127.0.0.1:17633 [Leader] entering Leader state
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:50.795152 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:50.795617 [INFO] consul: New leader elected: Node ff09fd96-f40f-20ab-50e8-07ee16fe3d49
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:50.954883 [WARN] agent: check 'mem' has interval below minimum of 1s
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:50.955034 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:50.955093 [INFO] consul: shutting down server
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:50.955138 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:50.955238 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.153176 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.253200 [INFO] manager: shutting down
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.253478 [INFO] agent: consul server down
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.253528 [INFO] agent: shutdown complete
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.253581 [INFO] agent: Stopping DNS server 127.0.0.1:17628 (tcp)
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.253708 [INFO] agent: Stopping DNS server 127.0.0.1:17628 (udp)
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.253859 [INFO] agent: Stopping HTTP server 127.0.0.1:17629 (tcp)
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.254080 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_MinInterval - 2019/12/30 18:57:51.254147 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_MinInterval (3.12s)
=== CONT  TestAgent_RemoveService
2019/12/30 18:57:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2068fad4-8c7a-1cc7-4f41-a9720918d5c1 Address:127.0.0.1:17645}]
2019/12/30 18:57:51 [INFO]  raft: Node at 127.0.0.1:17645 [Follower] entering Follower state (Leader: "")
2019/12/30 18:57:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2fb0aac5-b23a-3b50-2cce-6803071353de Address:127.0.0.1:17651}]
TestAgent_AddCheck - 2019/12/30 18:57:51.262333 [INFO] serf: EventMemberJoin: Node 2068fad4-8c7a-1cc7-4f41-a9720918d5c1.dc1 127.0.0.1
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.263755 [INFO] serf: EventMemberJoin: node1.dc1 127.0.0.1
2019/12/30 18:57:51 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:51 [INFO]  raft: Node at 127.0.0.1:17639 [Leader] entering Leader state
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.269562 [INFO] agent: Requesting shutdown
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.269659 [INFO] consul: shutting down server
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.269711 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.269872 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.270107 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/30 18:57:51 [INFO]  raft: Node at 127.0.0.1:17651 [Follower] entering Follower state (Leader: "")
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.274289 [INFO] serf: EventMemberJoin: node1 127.0.0.1
TestAgent_AddCheck - 2019/12/30 18:57:51.274499 [INFO] serf: EventMemberJoin: Node 2068fad4-8c7a-1cc7-4f41-a9720918d5c1 127.0.0.1
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.276089 [INFO] consul: Adding LAN server node1 (Addr: tcp/127.0.0.1:17651) (DC: dc1)
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.276671 [INFO] consul: Handled member-join event for server "node1.dc1" in area "wan"
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.276722 [INFO] agent: Started DNS server 127.0.0.1:17646 (udp)
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.277141 [INFO] agent: Started DNS server 127.0.0.1:17646 (tcp)
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.279869 [INFO] agent: Started HTTP server on 127.0.0.1:17647 (tcp)
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:51.279972 [INFO] agent: started state syncer
TestAgent_AddCheck - 2019/12/30 18:57:51.280559 [INFO] consul: Adding LAN server Node 2068fad4-8c7a-1cc7-4f41-a9720918d5c1 (Addr: tcp/127.0.0.1:17645) (DC: dc1)
TestAgent_AddCheck - 2019/12/30 18:57:51.280825 [INFO] consul: Handled member-join event for server "Node 2068fad4-8c7a-1cc7-4f41-a9720918d5c1.dc1" in area "wan"
TestAgent_AddCheck - 2019/12/30 18:57:51.281346 [INFO] agent: Started DNS server 127.0.0.1:17640 (udp)
TestAgent_AddCheck - 2019/12/30 18:57:51.281408 [INFO] agent: Started DNS server 127.0.0.1:17640 (tcp)
TestAgent_AddCheck - 2019/12/30 18:57:51.283728 [INFO] agent: Started HTTP server on 127.0.0.1:17641 (tcp)
TestAgent_AddCheck - 2019/12/30 18:57:51.283818 [INFO] agent: started state syncer
2019/12/30 18:57:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:51 [INFO]  raft: Node at 127.0.0.1:17645 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_RemoveService - 2019/12/30 18:57:51.321314 [WARN] agent: Node name "Node 43c9c1fe-4282-b67b-e72f-5147bdb1ad63" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_RemoveService - 2019/12/30 18:57:51.321799 [DEBUG] tlsutil: Update with version 1
TestAgent_RemoveService - 2019/12/30 18:57:51.324035 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:51 [INFO]  raft: Node at 127.0.0.1:17651 [Candidate] entering Candidate state in term 2
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.461605 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.580388 [INFO] manager: shutting down
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.805061 [INFO] agent: consul server down
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.805159 [INFO] agent: shutdown complete
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.805222 [INFO] agent: Stopping DNS server 127.0.0.1:17634 (tcp)
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.805400 [INFO] agent: Stopping DNS server 127.0.0.1:17634 (udp)
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.805586 [INFO] agent: Stopping HTTP server 127.0.0.1:17635 (tcp)
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.805829 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.805920 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck_StartPassing (3.07s)
=== CONT  TestAgent_AddServiceNoRemoteExec
TestAgent_AddCheck_StartPassing - 2019/12/30 18:57:51.807143 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:51.874571 [DEBUG] tlsutil: Update with version 1
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:51.877662 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:52 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:52 [INFO]  raft: Node at 127.0.0.1:17651 [Leader] entering Leader state
2019/12/30 18:57:52 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:52 [INFO]  raft: Node at 127.0.0.1:17645 [Leader] entering Leader state
TestAgent_AddCheck - 2019/12/30 18:57:52.031536 [INFO] consul: cluster leadership acquired
TestAgent_AddCheck - 2019/12/30 18:57:52.031982 [INFO] consul: New leader elected: Node 2068fad4-8c7a-1cc7-4f41-a9720918d5c1
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.032213 [INFO] consul: cluster leadership acquired
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.032527 [INFO] consul: New leader elected: node1
TestAgent_AddCheck - 2019/12/30 18:57:52.180359 [INFO] agent: Requesting shutdown
TestAgent_AddCheck - 2019/12/30 18:57:52.180482 [INFO] consul: shutting down server
TestAgent_AddCheck - 2019/12/30 18:57:52.180530 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck - 2019/12/30 18:57:52.180965 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_AddCheck - 2019/12/30 18:57:52.361525 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck - 2019/12/30 18:57:52.461735 [INFO] manager: shutting down
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.471018 [INFO] agent: Synced node info
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.473113 [DEBUG] agent: removed check "chk1"
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.473225 [DEBUG] agent: removed check "chk2"
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.473275 [DEBUG] agent: removed service "redis"
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.473356 [INFO] agent: Requesting shutdown
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.473407 [INFO] consul: shutting down server
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.473474 [WARN] serf: Shutdown without a Leave
TestAgent_AddCheck - 2019/12/30 18:57:52.569957 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_AddCheck - 2019/12/30 18:57:52.570308 [INFO] agent: consul server down
TestAgent_AddCheck - 2019/12/30 18:57:52.570366 [INFO] agent: shutdown complete
TestAgent_AddCheck - 2019/12/30 18:57:52.570428 [INFO] agent: Stopping DNS server 127.0.0.1:17640 (tcp)
TestAgent_AddCheck - 2019/12/30 18:57:52.570601 [INFO] agent: Stopping DNS server 127.0.0.1:17640 (udp)
TestAgent_AddCheck - 2019/12/30 18:57:52.570792 [INFO] agent: Stopping HTTP server 127.0.0.1:17641 (tcp)
TestAgent_AddCheck - 2019/12/30 18:57:52.571066 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddCheck - 2019/12/30 18:57:52.571153 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddCheck (3.31s)
=== CONT  TestAgent_AddServiceNoExec
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.571718 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddServiceNoExec - 2019/12/30 18:57:52.644025 [DEBUG] tlsutil: Update with version 1
TestAgent_AddServiceNoExec - 2019/12/30 18:57:52.646177 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.695944 [INFO] manager: shutting down
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.696305 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.696537 [INFO] agent: consul server down
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.696592 [INFO] agent: shutdown complete
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.696649 [INFO] agent: Stopping DNS server 127.0.0.1:17646 (tcp)
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.696806 [INFO] agent: Stopping DNS server 127.0.0.1:17646 (udp)
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.696973 [INFO] agent: Stopping HTTP server 127.0.0.1:17647 (tcp)
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.697193 [INFO] agent: Waiting for endpoints to shut down
TestAgent_RemoveServiceRemovesAllChecks - 2019/12/30 18:57:52.697281 [INFO] agent: Endpoints down
--- PASS: TestAgent_RemoveServiceRemovesAllChecks (3.43s)
=== CONT  TestAgent_AddService
2019/12/30 18:57:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:43c9c1fe-4282-b67b-e72f-5147bdb1ad63 Address:127.0.0.1:17657}]
2019/12/30 18:57:52 [INFO]  raft: Node at 127.0.0.1:17657 [Follower] entering Follower state (Leader: "")
TestAgent_RemoveService - 2019/12/30 18:57:52.705974 [INFO] serf: EventMemberJoin: Node 43c9c1fe-4282-b67b-e72f-5147bdb1ad63.dc1 127.0.0.1
TestAgent_RemoveService - 2019/12/30 18:57:52.711216 [INFO] serf: EventMemberJoin: Node 43c9c1fe-4282-b67b-e72f-5147bdb1ad63 127.0.0.1
TestAgent_RemoveService - 2019/12/30 18:57:52.712109 [INFO] consul: Adding LAN server Node 43c9c1fe-4282-b67b-e72f-5147bdb1ad63 (Addr: tcp/127.0.0.1:17657) (DC: dc1)
TestAgent_RemoveService - 2019/12/30 18:57:52.712810 [INFO] consul: Handled member-join event for server "Node 43c9c1fe-4282-b67b-e72f-5147bdb1ad63.dc1" in area "wan"
TestAgent_RemoveService - 2019/12/30 18:57:52.714453 [INFO] agent: Started DNS server 127.0.0.1:17652 (tcp)
TestAgent_RemoveService - 2019/12/30 18:57:52.714860 [INFO] agent: Started DNS server 127.0.0.1:17652 (udp)
TestAgent_RemoveService - 2019/12/30 18:57:52.717359 [INFO] agent: Started HTTP server on 127.0.0.1:17653 (tcp)
TestAgent_RemoveService - 2019/12/30 18:57:52.717468 [INFO] agent: started state syncer
2019/12/30 18:57:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:52 [INFO]  raft: Node at 127.0.0.1:17657 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_AddService - 2019/12/30 18:57:52.774285 [DEBUG] tlsutil: Update with version 1
TestAgent_AddService - 2019/12/30 18:57:52.778220 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2db33c59-4d6b-6597-2071-5eb1c6071f7b Address:127.0.0.1:17663}]
2019/12/30 18:57:53 [INFO]  raft: Node at 127.0.0.1:17663 [Follower] entering Follower state (Leader: "")
jones - 2019/12/30 18:57:53.377582 [DEBUG] consul: Skipping self join check for "Node 5122c9d8-8979-c841-956f-094a90e62880" since the cluster is too small
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.397626 [INFO] serf: EventMemberJoin: node1.dc1 127.0.0.1
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.404038 [INFO] serf: EventMemberJoin: node1 127.0.0.1
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.405411 [INFO] consul: Adding LAN server node1 (Addr: tcp/127.0.0.1:17663) (DC: dc1)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.405906 [INFO] consul: Handled member-join event for server "node1.dc1" in area "wan"
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.407757 [INFO] agent: Started DNS server 127.0.0.1:17658 (tcp)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.407845 [INFO] agent: Started DNS server 127.0.0.1:17658 (udp)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.410730 [INFO] agent: Started HTTP server on 127.0.0.1:17659 (tcp)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:53.410860 [INFO] agent: started state syncer
2019/12/30 18:57:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:53 [INFO]  raft: Node at 127.0.0.1:17663 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:53 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:53 [INFO]  raft: Node at 127.0.0.1:17657 [Leader] entering Leader state
TestAgent_RemoveService - 2019/12/30 18:57:53.616153 [INFO] consul: cluster leadership acquired
TestAgent_RemoveService - 2019/12/30 18:57:53.616630 [INFO] consul: New leader elected: Node 43c9c1fe-4282-b67b-e72f-5147bdb1ad63
2019/12/30 18:57:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a2295fed-cf38-220a-7a36-e7bb554b2d8d Address:127.0.0.1:17669}]
2019/12/30 18:57:53 [INFO]  raft: Node at 127.0.0.1:17669 [Follower] entering Follower state (Leader: "")
TestAgent_RemoveService - 2019/12/30 18:57:53.998068 [INFO] agent: Synced node info
TestAgent_RemoveService - 2019/12/30 18:57:53.998246 [WARN] agent: Failed to deregister service "redis": Service "redis" does not exist
TestAgent_RemoveService - 2019/12/30 18:57:53.998695 [DEBUG] agent: removed check "service:memcache"
TestAgent_RemoveService - 2019/12/30 18:57:53.998761 [DEBUG] agent: removed check "check2"
TestAgent_RemoveService - 2019/12/30 18:57:53.998801 [DEBUG] agent: removed service "memcache"
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.002015 [INFO] serf: EventMemberJoin: node1.dc1 127.0.0.1
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.005434 [INFO] serf: EventMemberJoin: node1 127.0.0.1
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.006958 [INFO] agent: Started DNS server 127.0.0.1:17664 (udp)
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.007410 [INFO] consul: Handled member-join event for server "node1.dc1" in area "wan"
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.007845 [INFO] consul: Adding LAN server node1 (Addr: tcp/127.0.0.1:17669) (DC: dc1)
TestAgent_RemoveService - 2019/12/30 18:57:54.008133 [DEBUG] agent: removed check "service:redis:1"
TestAgent_RemoveService - 2019/12/30 18:57:54.008216 [DEBUG] agent: removed check "service:redis:2"
TestAgent_RemoveService - 2019/12/30 18:57:54.008255 [DEBUG] agent: removed service "redis"
TestAgent_RemoveService - 2019/12/30 18:57:54.008361 [INFO] agent: Requesting shutdown
TestAgent_RemoveService - 2019/12/30 18:57:54.008428 [INFO] consul: shutting down server
TestAgent_RemoveService - 2019/12/30 18:57:54.008473 [WARN] serf: Shutdown without a Leave
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.010411 [INFO] agent: Started DNS server 127.0.0.1:17664 (tcp)
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.012670 [INFO] agent: Started HTTP server on 127.0.0.1:17665 (tcp)
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.012782 [INFO] agent: started state syncer
2019/12/30 18:57:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:54 [INFO]  raft: Node at 127.0.0.1:17669 [Candidate] entering Candidate state in term 2
TestAgent_RemoveService - 2019/12/30 18:57:54.128205 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:54 [INFO]  raft: Node at 127.0.0.1:17663 [Leader] entering Leader state
2019/12/30 18:57:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1f71570d-2dee-503b-4348-7797d589e0c2 Address:127.0.0.1:17675}]
TestAgent_AddService - 2019/12/30 18:57:54.137840 [INFO] serf: EventMemberJoin: node1.dc1 127.0.0.1
TestAgent_AddService - 2019/12/30 18:57:54.142694 [INFO] serf: EventMemberJoin: node1 127.0.0.1
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:54.143562 [INFO] consul: cluster leadership acquired
TestAgent_AddService - 2019/12/30 18:57:54.143977 [INFO] agent: Started DNS server 127.0.0.1:17670 (udp)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:54.144082 [INFO] consul: New leader elected: node1
2019/12/30 18:57:54 [INFO]  raft: Node at 127.0.0.1:17675 [Follower] entering Follower state (Leader: "")
TestAgent_AddService - 2019/12/30 18:57:54.145350 [INFO] consul: Handled member-join event for server "node1.dc1" in area "wan"
TestAgent_AddService - 2019/12/30 18:57:54.145984 [INFO] agent: Started DNS server 127.0.0.1:17670 (tcp)
TestAgent_AddService - 2019/12/30 18:57:54.146080 [INFO] consul: Adding LAN server node1 (Addr: tcp/127.0.0.1:17675) (DC: dc1)
TestAgent_AddService - 2019/12/30 18:57:54.148329 [INFO] agent: Started HTTP server on 127.0.0.1:17671 (tcp)
TestAgent_AddService - 2019/12/30 18:57:54.148421 [INFO] agent: started state syncer
2019/12/30 18:57:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:54 [INFO]  raft: Node at 127.0.0.1:17675 [Candidate] entering Candidate state in term 2
TestAgent_RemoveService - 2019/12/30 18:57:54.278310 [INFO] manager: shutting down
TestAgent_RemoveService - 2019/12/30 18:57:54.279376 [INFO] agent: consul server down
TestAgent_RemoveService - 2019/12/30 18:57:54.279508 [INFO] agent: shutdown complete
TestAgent_RemoveService - 2019/12/30 18:57:54.279573 [INFO] agent: Stopping DNS server 127.0.0.1:17652 (tcp)
TestAgent_RemoveService - 2019/12/30 18:57:54.279737 [INFO] agent: Stopping DNS server 127.0.0.1:17652 (udp)
TestAgent_RemoveService - 2019/12/30 18:57:54.279911 [INFO] agent: Stopping HTTP server 127.0.0.1:17653 (tcp)
TestAgent_RemoveService - 2019/12/30 18:57:54.280170 [INFO] agent: Waiting for endpoints to shut down
TestAgent_RemoveService - 2019/12/30 18:57:54.280254 [INFO] agent: Endpoints down
--- PASS: TestAgent_RemoveService (3.03s)
=== CONT  TestAgent_makeNodeID
TestAgent_RemoveService - 2019/12/30 18:57:54.283062 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_RemoveService - 2019/12/30 18:57:54.283371 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_makeNodeID - 2019/12/30 18:57:54.356673 [DEBUG] agent: Using random ID "394d2fc4-91da-abb6-a604-fd39f201e7c6" as node ID
TestAgent_makeNodeID - 2019/12/30 18:57:54.358339 [WARN] agent: Node name "Node 7416007d-34ae-1eb0-3a46-9796188625f8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_makeNodeID - 2019/12/30 18:57:54.359293 [DEBUG] tlsutil: Update with version 1
TestAgent_makeNodeID - 2019/12/30 18:57:54.361725 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:54.579265 [INFO] agent: Synced node info
2019/12/30 18:57:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:54 [INFO]  raft: Node at 127.0.0.1:17669 [Leader] entering Leader state
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.873556 [INFO] consul: cluster leadership acquired
TestAgent_AddServiceNoExec - 2019/12/30 18:57:54.874029 [INFO] consul: New leader elected: node1
2019/12/30 18:57:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:54 [INFO]  raft: Node at 127.0.0.1:17675 [Leader] entering Leader state
TestAgent_AddService - 2019/12/30 18:57:54.978760 [INFO] consul: cluster leadership acquired
TestAgent_AddService - 2019/12/30 18:57:54.979206 [INFO] consul: New leader elected: node1
TestAgent_AddService - 2019/12/30 18:57:55.151053 [INFO] agent: Requesting shutdown
TestAgent_AddService - 2019/12/30 18:57:55.151140 [INFO] consul: shutting down server
TestAgent_AddService - 2019/12/30 18:57:55.151198 [WARN] serf: Shutdown without a Leave
TestAgent_AddService - 2019/12/30 18:57:55.263843 [WARN] serf: Shutdown without a Leave
TestAgent_AddServiceNoExec - 2019/12/30 18:57:55.362650 [INFO] agent: Synced node info
TestAgent_AddServiceNoExec - 2019/12/30 18:57:55.362920 [DEBUG] agent: Node info in sync
TestAgent_AddService - 2019/12/30 18:57:55.367212 [INFO] manager: shutting down
TestAgent_AddService - 2019/12/30 18:57:55.370489 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_AddService - 2019/12/30 18:57:55.370489 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_AddService - 2019/12/30 18:57:55.370684 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_AddService - 2019/12/30 18:57:55.371229 [INFO] agent: consul server down
TestAgent_AddService - 2019/12/30 18:57:55.371304 [INFO] agent: shutdown complete
TestAgent_AddService - 2019/12/30 18:57:55.371633 [INFO] agent: Stopping DNS server 127.0.0.1:17670 (tcp)
TestAgent_AddService - 2019/12/30 18:57:55.371993 [INFO] agent: Stopping DNS server 127.0.0.1:17670 (udp)
TestAgent_AddService - 2019/12/30 18:57:55.372344 [INFO] agent: Stopping HTTP server 127.0.0.1:17671 (tcp)
TestAgent_AddService - 2019/12/30 18:57:55.372718 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddService - 2019/12/30 18:57:55.372855 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddService (2.68s)
=== CONT  TestAgent_setupNodeID
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_setupNodeID - 2019/12/30 18:57:55.440368 [DEBUG] agent: Using random ID "ef8a2489-8c44-2c64-4594-629f9a11a485" as node ID
TestAgent_setupNodeID - 2019/12/30 18:57:55.441084 [WARN] agent: Node name "Node 40dc1beb-50d0-6d1c-cdf8-f77dc771a4cc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_setupNodeID - 2019/12/30 18:57:55.441653 [DEBUG] tlsutil: Update with version 1
TestAgent_setupNodeID - 2019/12/30 18:57:55.444053 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:394d2fc4-91da-abb6-a604-fd39f201e7c6 Address:127.0.0.1:17681}]
2019/12/30 18:57:55 [INFO]  raft: Node at 127.0.0.1:17681 [Follower] entering Follower state (Leader: "")
TestAgent_makeNodeID - 2019/12/30 18:57:55.730444 [INFO] serf: EventMemberJoin: Node 7416007d-34ae-1eb0-3a46-9796188625f8.dc1 127.0.0.1
TestAgent_makeNodeID - 2019/12/30 18:57:55.740699 [INFO] serf: EventMemberJoin: Node 7416007d-34ae-1eb0-3a46-9796188625f8 127.0.0.1
TestAgent_makeNodeID - 2019/12/30 18:57:55.741191 [INFO] consul: Handled member-join event for server "Node 7416007d-34ae-1eb0-3a46-9796188625f8.dc1" in area "wan"
TestAgent_makeNodeID - 2019/12/30 18:57:55.741797 [INFO] agent: Started DNS server 127.0.0.1:17676 (tcp)
TestAgent_makeNodeID - 2019/12/30 18:57:55.741875 [INFO] agent: Started DNS server 127.0.0.1:17676 (udp)
TestAgent_makeNodeID - 2019/12/30 18:57:55.742180 [INFO] consul: Adding LAN server Node 7416007d-34ae-1eb0-3a46-9796188625f8 (Addr: tcp/127.0.0.1:17681) (DC: dc1)
TestAgent_makeNodeID - 2019/12/30 18:57:55.744314 [INFO] agent: Started HTTP server on 127.0.0.1:17677 (tcp)
TestAgent_makeNodeID - 2019/12/30 18:57:55.744453 [INFO] agent: started state syncer
2019/12/30 18:57:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:55 [INFO]  raft: Node at 127.0.0.1:17681 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:57:55.863770 [DEBUG] consul: Skipping self join check for "Node a8b3e297-b53a-bcd0-efda-5addcd938805" since the cluster is too small
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.145728 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.146232 [DEBUG] consul: Skipping self join check for "node1" since the cluster is too small
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.146430 [INFO] consul: member 'node1' joined, marking health alive
2019/12/30 18:57:56 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:56 [INFO]  raft: Node at 127.0.0.1:17681 [Leader] entering Leader state
TestAgent_makeNodeID - 2019/12/30 18:57:56.497436 [INFO] consul: cluster leadership acquired
TestAgent_makeNodeID - 2019/12/30 18:57:56.497839 [INFO] consul: New leader elected: Node 7416007d-34ae-1eb0-3a46-9796188625f8
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.514469 [INFO] agent: Requesting shutdown
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.514562 [INFO] consul: shutting down server
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.514615 [WARN] serf: Shutdown without a Leave
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.664147 [WARN] serf: Shutdown without a Leave
TestAgent_makeNodeID - 2019/12/30 18:57:56.687410 [DEBUG] agent: Using random ID "06dcc8ea-caa4-ac3e-0fae-c1a72d75a777" as node ID
TestAgent_makeNodeID - 2019/12/30 18:57:56.687520 [DEBUG] agent: Using random ID "9d3af4eb-ce8c-4c3f-2f00-5683bde3ffa3" as node ID
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.760206 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgent_AddServiceNoExec - 2019/12/30 18:57:56.784811 [DEBUG] agent: Node info in sync
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:56.792697 [INFO] manager: shutting down
2019/12/30 18:57:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ef8a2489-8c44-2c64-4594-629f9a11a485 Address:127.0.0.1:17689}]
TestAgent_makeNodeID - 2019/12/30 18:57:57.071333 [INFO] agent: Synced node info
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.074223 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.074765 [DEBUG] consul: Skipping self join check for "node1" since the cluster is too small
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.074943 [INFO] consul: member 'node1' joined, marking health alive
TestAgent_setupNodeID - 2019/12/30 18:57:57.079850 [INFO] serf: EventMemberJoin: Node 40dc1beb-50d0-6d1c-cdf8-f77dc771a4cc.dc1 127.0.0.1
2019/12/30 18:57:57 [INFO]  raft: Node at 127.0.0.1:17689 [Follower] entering Follower state (Leader: "")
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.081114 [INFO] agent: consul server down
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.081177 [INFO] agent: shutdown complete
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.081247 [INFO] agent: Stopping DNS server 127.0.0.1:17658 (tcp)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.081425 [INFO] agent: Stopping DNS server 127.0.0.1:17658 (udp)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.081587 [INFO] agent: Stopping HTTP server 127.0.0.1:17659 (tcp)
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.081822 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.081897 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddServiceNoRemoteExec (5.28s)
=== CONT  TestAgent_ReconnectConfigWanDisabled
TestAgent_setupNodeID - 2019/12/30 18:57:57.093622 [INFO] serf: EventMemberJoin: Node 40dc1beb-50d0-6d1c-cdf8-f77dc771a4cc 127.0.0.1
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.096023 [WARN] agent: Deregistering service "svcid1" failed. leadership lost while committing log
TestAgent_AddServiceNoRemoteExec - 2019/12/30 18:57:57.096103 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_setupNodeID - 2019/12/30 18:57:57.096495 [INFO] consul: Adding LAN server Node 40dc1beb-50d0-6d1c-cdf8-f77dc771a4cc (Addr: tcp/127.0.0.1:17689) (DC: dc1)
TestAgent_setupNodeID - 2019/12/30 18:57:57.096853 [INFO] consul: Handled member-join event for server "Node 40dc1beb-50d0-6d1c-cdf8-f77dc771a4cc.dc1" in area "wan"
TestAgent_setupNodeID - 2019/12/30 18:57:57.099485 [INFO] agent: Started DNS server 127.0.0.1:17682 (tcp)
TestAgent_setupNodeID - 2019/12/30 18:57:57.099607 [INFO] agent: Started DNS server 127.0.0.1:17682 (udp)
TestAgent_setupNodeID - 2019/12/30 18:57:57.102725 [INFO] agent: Started HTTP server on 127.0.0.1:17683 (tcp)
TestAgent_setupNodeID - 2019/12/30 18:57:57.102837 [INFO] agent: started state syncer
2019/12/30 18:57:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:57 [INFO]  raft: Node at 127.0.0.1:17689 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:57.322867 [WARN] agent: Node name "Node af0eaeb5-dc9d-ed33-96f4-e526ba42e366" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:57.323540 [DEBUG] tlsutil: Update with version 1
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:57.326590 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.571054 [INFO] agent: Requesting shutdown
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.571133 [INFO] consul: shutting down server
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.571179 [WARN] serf: Shutdown without a Leave
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.737999 [WARN] serf: Shutdown without a Leave
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.850041 [INFO] manager: shutting down
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.851897 [INFO] agent: consul server down
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.851964 [INFO] agent: shutdown complete
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.852091 [INFO] agent: Stopping DNS server 127.0.0.1:17664 (tcp)
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.852268 [INFO] agent: Stopping DNS server 127.0.0.1:17664 (udp)
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.852472 [INFO] agent: Stopping HTTP server 127.0.0.1:17665 (tcp)
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.852746 [INFO] agent: Waiting for endpoints to shut down
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.852839 [INFO] agent: Endpoints down
--- PASS: TestAgent_AddServiceNoExec (5.28s)
=== CONT  TestAgent_ReconnectConfigSettings
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.853521 [WARN] agent: Deregistering service "svcid1" failed. leadership lost while committing log
TestAgent_AddServiceNoExec - 2019/12/30 18:57:57.853598 [ERR] agent: failed to sync changes: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:57.931771 [WARN] agent: Node name "Node c2ed98f4-cde8-c4ee-13bf-e08efaad8330" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:57.932544 [DEBUG] tlsutil: Update with version 1
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:57.935534 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_makeNodeID - 2019/12/30 18:57:58.262087 [DEBUG] agent: Using unique ID "e063db70-ccf9-0358-7357-2df7ecc92055" from host as node ID
TestAgent_makeNodeID - 2019/12/30 18:57:58.312571 [DEBUG] agent: Using unique ID "e063db70-ccf9-0358-7357-2df7ecc92055" from host as node ID
TestAgent_makeNodeID - 2019/12/30 18:57:58.312663 [INFO] agent: Requesting shutdown
TestAgent_makeNodeID - 2019/12/30 18:57:58.312758 [INFO] consul: shutting down server
TestAgent_makeNodeID - 2019/12/30 18:57:58.312829 [WARN] serf: Shutdown without a Leave
2019/12/30 18:57:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:58 [INFO]  raft: Node at 127.0.0.1:17689 [Leader] entering Leader state
jones - 2019/12/30 18:57:58.323797 [DEBUG] consul: Skipping self join check for "Node 4707dfb8-f5da-73eb-b8a2-b66f848ceb6d" since the cluster is too small
TestAgent_setupNodeID - 2019/12/30 18:57:58.324139 [INFO] consul: cluster leadership acquired
TestAgent_setupNodeID - 2019/12/30 18:57:58.324682 [INFO] consul: New leader elected: Node 40dc1beb-50d0-6d1c-cdf8-f77dc771a4cc
TestAgent_makeNodeID - 2019/12/30 18:57:58.448331 [WARN] serf: Shutdown without a Leave
TestAgent_setupNodeID - 2019/12/30 18:57:58.648043 [INFO] agent: Requesting shutdown
TestAgent_setupNodeID - 2019/12/30 18:57:58.648151 [INFO] consul: shutting down server
TestAgent_setupNodeID - 2019/12/30 18:57:58.648206 [WARN] serf: Shutdown without a Leave
TestAgent_makeNodeID - 2019/12/30 18:57:58.662248 [INFO] manager: shutting down
TestAgent_makeNodeID - 2019/12/30 18:57:58.692505 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_setupNodeID - 2019/12/30 18:57:58.961712 [WARN] serf: Shutdown without a Leave
TestAgent_makeNodeID - 2019/12/30 18:57:58.962193 [INFO] agent: consul server down
TestAgent_makeNodeID - 2019/12/30 18:57:58.962261 [INFO] agent: shutdown complete
TestAgent_makeNodeID - 2019/12/30 18:57:58.962315 [INFO] agent: Stopping DNS server 127.0.0.1:17676 (tcp)
TestAgent_makeNodeID - 2019/12/30 18:57:58.962489 [INFO] agent: Stopping DNS server 127.0.0.1:17676 (udp)
TestAgent_makeNodeID - 2019/12/30 18:57:58.962651 [INFO] agent: Stopping HTTP server 127.0.0.1:17677 (tcp)
TestAgent_makeNodeID - 2019/12/30 18:57:58.962861 [INFO] agent: Waiting for endpoints to shut down
TestAgent_makeNodeID - 2019/12/30 18:57:58.962925 [INFO] agent: Endpoints down
--- PASS: TestAgent_makeNodeID (4.68s)
=== CONT  TestAgent_TokenStore
TestAgent_makeNodeID - 2019/12/30 18:57:58.966837 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_TokenStore - 2019/12/30 18:57:59.031632 [WARN] agent: Node name "Node de68650a-7c89-3f88-60ea-a964e048bed8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_TokenStore - 2019/12/30 18:57:59.032013 [DEBUG] tlsutil: Update with version 1
TestAgent_TokenStore - 2019/12/30 18:57:59.034362 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/30 18:57:59.164079 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/30 18:57:59.164156 [DEBUG] agent: Node info in sync
TestAgent_setupNodeID - 2019/12/30 18:57:59.186824 [INFO] manager: shutting down
TestAgent_setupNodeID - 2019/12/30 18:57:59.188128 [INFO] agent: consul server down
TestAgent_setupNodeID - 2019/12/30 18:57:59.188216 [INFO] agent: shutdown complete
TestAgent_setupNodeID - 2019/12/30 18:57:59.188293 [INFO] agent: Stopping DNS server 127.0.0.1:17682 (tcp)
TestAgent_setupNodeID - 2019/12/30 18:57:59.188474 [INFO] agent: Stopping DNS server 127.0.0.1:17682 (udp)
TestAgent_setupNodeID - 2019/12/30 18:57:59.188677 [INFO] agent: Stopping HTTP server 127.0.0.1:17683 (tcp)
TestAgent_setupNodeID - 2019/12/30 18:57:59.188926 [INFO] agent: Waiting for endpoints to shut down
TestAgent_setupNodeID - 2019/12/30 18:57:59.189011 [INFO] agent: Endpoints down
--- PASS: TestAgent_setupNodeID (3.82s)
=== CONT  TestAgent_RPCPing
TestAgent_setupNodeID - 2019/12/30 18:57:59.189623 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestAgent_setupNodeID - 2019/12/30 18:57:59.189964 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_setupNodeID - 2019/12/30 18:57:59.190155 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_RPCPing - 2019/12/30 18:57:59.250781 [WARN] agent: Node name "Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_RPCPing - 2019/12/30 18:57:59.251232 [DEBUG] tlsutil: Update with version 1
TestAgent_RPCPing - 2019/12/30 18:57:59.253721 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:af0eaeb5-dc9d-ed33-96f4-e526ba42e366 Address:127.0.0.1:17725}]
2019/12/30 18:57:59 [INFO]  raft: Node at 127.0.0.1:17725 [Follower] entering Follower state (Leader: "")
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:59.469786 [INFO] serf: EventMemberJoin: Node af0eaeb5-dc9d-ed33-96f4-e526ba42e366 127.0.0.1
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:59.470523 [INFO] consul: Adding LAN server Node af0eaeb5-dc9d-ed33-96f4-e526ba42e366 (Addr: tcp/127.0.0.1:17725) (DC: dc1)
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:59.471227 [INFO] agent: Started DNS server 127.0.0.1:17695 (udp)
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:59.471316 [INFO] agent: Started DNS server 127.0.0.1:17695 (tcp)
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:59.473877 [INFO] agent: Started HTTP server on 127.0.0.1:17701 (tcp)
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:57:59.473998 [INFO] agent: started state syncer
2019/12/30 18:57:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:59 [INFO]  raft: Node at 127.0.0.1:17725 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c2ed98f4-cde8-c4ee-13bf-e08efaad8330 Address:127.0.0.1:17744}]
2019/12/30 18:57:59 [INFO]  raft: Node at 127.0.0.1:17744 [Follower] entering Follower state (Leader: "")
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.783947 [INFO] serf: EventMemberJoin: Node c2ed98f4-cde8-c4ee-13bf-e08efaad8330.dc1 127.0.0.1
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.788237 [INFO] serf: EventMemberJoin: Node c2ed98f4-cde8-c4ee-13bf-e08efaad8330 127.0.0.1
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.788992 [INFO] consul: Adding LAN server Node c2ed98f4-cde8-c4ee-13bf-e08efaad8330 (Addr: tcp/127.0.0.1:17744) (DC: dc1)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.789644 [INFO] consul: Handled member-join event for server "Node c2ed98f4-cde8-c4ee-13bf-e08efaad8330.dc1" in area "wan"
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.789896 [INFO] agent: Started DNS server 127.0.0.1:17731 (udp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.789966 [INFO] agent: Started DNS server 127.0.0.1:17731 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.792392 [INFO] agent: Started HTTP server on 127.0.0.1:17737 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:57:59.792979 [INFO] agent: started state syncer
2019/12/30 18:57:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:59 [INFO]  raft: Node at 127.0.0.1:17744 [Candidate] entering Candidate state in term 2
jones - 2019/12/30 18:58:00.562514 [DEBUG] consul: Skipping self join check for "Node f632792c-c81a-fbfb-b7c4-e99bdb454ade" since the cluster is too small
2019/12/30 18:58:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:00 [INFO]  raft: Node at 127.0.0.1:17725 [Leader] entering Leader state
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:00.563992 [INFO] consul: cluster leadership acquired
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:00.564587 [INFO] consul: New leader elected: Node af0eaeb5-dc9d-ed33-96f4-e526ba42e366
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:00.829783 [INFO] agent: Requesting shutdown
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:00.829895 [INFO] consul: shutting down server
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:00.829944 [WARN] serf: Shutdown without a Leave
2019/12/30 18:58:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:de68650a-7c89-3f88-60ea-a964e048bed8 Address:127.0.0.1:17750}]
TestAgent_TokenStore - 2019/12/30 18:58:00.841312 [INFO] serf: EventMemberJoin: Node de68650a-7c89-3f88-60ea-a964e048bed8.dc1 127.0.0.1
2019/12/30 18:58:00 [INFO]  raft: Node at 127.0.0.1:17750 [Follower] entering Follower state (Leader: "")
TestAgent_TokenStore - 2019/12/30 18:58:00.848312 [INFO] serf: EventMemberJoin: Node de68650a-7c89-3f88-60ea-a964e048bed8 127.0.0.1
TestAgent_TokenStore - 2019/12/30 18:58:00.849716 [INFO] consul: Adding LAN server Node de68650a-7c89-3f88-60ea-a964e048bed8 (Addr: tcp/127.0.0.1:17750) (DC: dc1)
TestAgent_TokenStore - 2019/12/30 18:58:00.849933 [INFO] agent: Started DNS server 127.0.0.1:17745 (udp)
TestAgent_TokenStore - 2019/12/30 18:58:00.849953 [INFO] consul: Handled member-join event for server "Node de68650a-7c89-3f88-60ea-a964e048bed8.dc1" in area "wan"
TestAgent_TokenStore - 2019/12/30 18:58:00.850394 [INFO] agent: Started DNS server 127.0.0.1:17745 (tcp)
TestAgent_TokenStore - 2019/12/30 18:58:00.853166 [INFO] agent: Started HTTP server on 127.0.0.1:17746 (tcp)
TestAgent_TokenStore - 2019/12/30 18:58:00.854350 [INFO] agent: started state syncer
2019/12/30 18:58:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:00 [INFO]  raft: Node at 127.0.0.1:17750 [Candidate] entering Candidate state in term 2
2019/12/30 18:58:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:00 [INFO]  raft: Node at 127.0.0.1:17744 [Leader] entering Leader state
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:00.979566 [INFO] consul: cluster leadership acquired
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:00.980104 [INFO] consul: New leader elected: Node c2ed98f4-cde8-c4ee-13bf-e08efaad8330
2019/12/30 18:58:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0a2b2a46-6cdc-6e09-e813-cbd11f445e7c Address:127.0.0.1:17756}]
2019/12/30 18:58:00 [INFO]  raft: Node at 127.0.0.1:17756 [Follower] entering Follower state (Leader: "")
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:00.994907 [INFO] agent: Requesting shutdown
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:00.995018 [INFO] consul: shutting down server
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:00.995066 [WARN] serf: Shutdown without a Leave
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:00.995468 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_RPCPing - 2019/12/30 18:58:00.984356 [INFO] serf: EventMemberJoin: Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c.dc1 127.0.0.1
TestAgent_RPCPing - 2019/12/30 18:58:01.001704 [INFO] serf: EventMemberJoin: Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c 127.0.0.1
TestAgent_RPCPing - 2019/12/30 18:58:01.003002 [INFO] agent: Started DNS server 127.0.0.1:17751 (udp)
TestAgent_RPCPing - 2019/12/30 18:58:01.004058 [INFO] consul: Adding LAN server Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c (Addr: tcp/127.0.0.1:17756) (DC: dc1)
TestAgent_RPCPing - 2019/12/30 18:58:01.004270 [INFO] consul: Handled member-join event for server "Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c.dc1" in area "wan"
TestAgent_RPCPing - 2019/12/30 18:58:01.004964 [INFO] agent: Started DNS server 127.0.0.1:17751 (tcp)
TestAgent_RPCPing - 2019/12/30 18:58:01.007605 [INFO] agent: Started HTTP server on 127.0.0.1:17752 (tcp)
TestAgent_RPCPing - 2019/12/30 18:58:01.007779 [INFO] agent: started state syncer
2019/12/30 18:58:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:01 [INFO]  raft: Node at 127.0.0.1:17756 [Candidate] entering Candidate state in term 2
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.095102 [WARN] serf: Shutdown without a Leave
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.096953 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.096987 [INFO] agent: consul server down
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.097333 [INFO] agent: shutdown complete
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.097483 [INFO] agent: Stopping DNS server 127.0.0.1:17695 (tcp)
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.097744 [INFO] agent: Stopping DNS server 127.0.0.1:17695 (udp)
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.098117 [INFO] agent: Stopping HTTP server 127.0.0.1:17701 (tcp)
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.098462 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.098656 [INFO] agent: Endpoints down
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.097075 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_ReconnectConfigWanDisabled - 2019/12/30 18:58:01.099211 [ERR] agent: failed to sync remote state: leadership lost while committing log
=== CONT  TestAgent_StartStop
--- PASS: TestAgent_ReconnectConfigWanDisabled (4.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_StartStop - 2019/12/30 18:58:01.174529 [WARN] agent: Node name "Node dbe003c1-5517-0c0f-238e-abedbba758ee" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_StartStop - 2019/12/30 18:58:01.175107 [DEBUG] tlsutil: Update with version 1
TestAgent_StartStop - 2019/12/30 18:58:01.177622 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.186891 [INFO] manager: shutting down
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.462224 [INFO] agent: consul server down
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.462311 [INFO] agent: shutdown complete
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.462377 [INFO] agent: Stopping DNS server 127.0.0.1:17731 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.462553 [INFO] agent: Stopping DNS server 127.0.0.1:17731 (udp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.462727 [INFO] agent: Stopping HTTP server 127.0.0.1:17737 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.462957 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.463037 [INFO] agent: Endpoints down
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.466302 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.538328 [WARN] agent: Node name "Node 83ab5a5a-8f73-b6bc-2628-f1a7aa37a72c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.538757 [DEBUG] tlsutil: Update with version 1
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:01.543327 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:58:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:01 [INFO]  raft: Node at 127.0.0.1:17756 [Leader] entering Leader state
2019/12/30 18:58:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:01 [INFO]  raft: Node at 127.0.0.1:17750 [Leader] entering Leader state
TestAgent_RPCPing - 2019/12/30 18:58:01.742121 [INFO] consul: cluster leadership acquired
TestAgent_TokenStore - 2019/12/30 18:58:01.742250 [INFO] consul: cluster leadership acquired
TestAgent_RPCPing - 2019/12/30 18:58:01.742573 [INFO] consul: New leader elected: Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c
TestAgent_TokenStore - 2019/12/30 18:58:01.742605 [INFO] consul: New leader elected: Node de68650a-7c89-3f88-60ea-a964e048bed8
TestAgent_TokenStore - 2019/12/30 18:58:01.899773 [INFO] agent: Requesting shutdown
TestAgent_TokenStore - 2019/12/30 18:58:01.899881 [INFO] consul: shutting down server
TestAgent_TokenStore - 2019/12/30 18:58:01.899934 [WARN] serf: Shutdown without a Leave
TestAgent_TokenStore - 2019/12/30 18:58:02.103337 [WARN] serf: Shutdown without a Leave
TestAgent_TokenStore - 2019/12/30 18:58:02.163101 [INFO] manager: shutting down
TestAgent_TokenStore - 2019/12/30 18:58:02.278908 [INFO] agent: consul server down
TestAgent_TokenStore - 2019/12/30 18:58:02.279017 [INFO] agent: shutdown complete
TestAgent_TokenStore - 2019/12/30 18:58:02.279119 [INFO] agent: Stopping DNS server 127.0.0.1:17745 (tcp)
TestAgent_TokenStore - 2019/12/30 18:58:02.279358 [INFO] agent: Stopping DNS server 127.0.0.1:17745 (udp)
TestAgent_TokenStore - 2019/12/30 18:58:02.279758 [INFO] agent: Stopping HTTP server 127.0.0.1:17746 (tcp)
TestAgent_TokenStore - 2019/12/30 18:58:02.280127 [INFO] agent: Waiting for endpoints to shut down
TestAgent_TokenStore - 2019/12/30 18:58:02.280285 [INFO] agent: Endpoints down
--- PASS: TestAgent_TokenStore (3.32s)
=== CONT  TestAgent_HostBadACL
TestAgent_TokenStore - 2019/12/30 18:58:02.292491 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_TokenStore - 2019/12/30 18:58:02.293925 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_TokenStore - 2019/12/30 18:58:02.294254 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_HostBadACL - 2019/12/30 18:58:02.391993 [WARN] agent: Node name "Node 4a21db28-2f80-803e-6945-54fc57da142a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_HostBadACL - 2019/12/30 18:58:02.392719 [DEBUG] tlsutil: Update with version 1
TestAgent_HostBadACL - 2019/12/30 18:58:02.395848 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:58:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:dbe003c1-5517-0c0f-238e-abedbba758ee Address:127.0.0.1:17762}]
TestAgent_StartStop - 2019/12/30 18:58:03.332720 [INFO] serf: EventMemberJoin: Node dbe003c1-5517-0c0f-238e-abedbba758ee.dc1 127.0.0.1
2019/12/30 18:58:03 [INFO]  raft: Node at 127.0.0.1:17762 [Follower] entering Follower state (Leader: "")
TestAgent_RPCPing - 2019/12/30 18:58:03.336325 [INFO] agent: Synced node info
TestAgent_StartStop - 2019/12/30 18:58:03.336842 [INFO] serf: EventMemberJoin: Node dbe003c1-5517-0c0f-238e-abedbba758ee 127.0.0.1
TestAgent_StartStop - 2019/12/30 18:58:03.338418 [INFO] agent: Started DNS server 127.0.0.1:17757 (udp)
TestAgent_StartStop - 2019/12/30 18:58:03.340544 [INFO] consul: Handled member-join event for server "Node dbe003c1-5517-0c0f-238e-abedbba758ee.dc1" in area "wan"
TestAgent_StartStop - 2019/12/30 18:58:03.340552 [INFO] consul: Adding LAN server Node dbe003c1-5517-0c0f-238e-abedbba758ee (Addr: tcp/127.0.0.1:17762) (DC: dc1)
TestAgent_StartStop - 2019/12/30 18:58:03.340840 [INFO] agent: Started DNS server 127.0.0.1:17757 (tcp)
TestAgent_StartStop - 2019/12/30 18:58:03.343524 [INFO] agent: Started HTTP server on 127.0.0.1:17758 (tcp)
TestAgent_StartStop - 2019/12/30 18:58:03.343624 [INFO] agent: started state syncer
2019/12/30 18:58:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:03 [INFO]  raft: Node at 127.0.0.1:17762 [Candidate] entering Candidate state in term 2
2019/12/30 18:58:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:83ab5a5a-8f73-b6bc-2628-f1a7aa37a72c Address:127.0.0.1:17768}]
2019/12/30 18:58:03 [INFO]  raft: Node at 127.0.0.1:17768 [Follower] entering Follower state (Leader: "")
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.644847 [INFO] serf: EventMemberJoin: Node 83ab5a5a-8f73-b6bc-2628-f1a7aa37a72c.dc1 127.0.0.1
2019/12/30 18:58:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:03 [INFO]  raft: Node at 127.0.0.1:17768 [Candidate] entering Candidate state in term 2
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.662317 [INFO] serf: EventMemberJoin: Node 83ab5a5a-8f73-b6bc-2628-f1a7aa37a72c 127.0.0.1
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.666004 [INFO] consul: Adding LAN server Node 83ab5a5a-8f73-b6bc-2628-f1a7aa37a72c (Addr: tcp/127.0.0.1:17768) (DC: dc1)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.668603 [INFO] consul: Handled member-join event for server "Node 83ab5a5a-8f73-b6bc-2628-f1a7aa37a72c.dc1" in area "wan"
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.671090 [INFO] agent: Started DNS server 127.0.0.1:17763 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.672022 [INFO] agent: Started DNS server 127.0.0.1:17763 (udp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.675043 [INFO] agent: Started HTTP server on 127.0.0.1:17764 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:03.675153 [INFO] agent: started state syncer
2019/12/30 18:58:04 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:04 [INFO]  raft: Node at 127.0.0.1:17762 [Leader] entering Leader state
TestAgent_StartStop - 2019/12/30 18:58:04.014208 [INFO] consul: cluster leadership acquired
TestAgent_StartStop - 2019/12/30 18:58:04.014757 [INFO] consul: New leader elected: Node dbe003c1-5517-0c0f-238e-abedbba758ee
2019/12/30 18:58:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4a21db28-2f80-803e-6945-54fc57da142a Address:127.0.0.1:17774}]
2019/12/30 18:58:04 [INFO]  raft: Node at 127.0.0.1:17774 [Follower] entering Follower state (Leader: "")
TestAgent_HostBadACL - 2019/12/30 18:58:04.274702 [INFO] serf: EventMemberJoin: Node 4a21db28-2f80-803e-6945-54fc57da142a.dc1 127.0.0.1
TestAgent_HostBadACL - 2019/12/30 18:58:04.278204 [INFO] serf: EventMemberJoin: Node 4a21db28-2f80-803e-6945-54fc57da142a 127.0.0.1
TestAgent_HostBadACL - 2019/12/30 18:58:04.278769 [INFO] consul: Adding LAN server Node 4a21db28-2f80-803e-6945-54fc57da142a (Addr: tcp/127.0.0.1:17774) (DC: dc1)
TestAgent_HostBadACL - 2019/12/30 18:58:04.279043 [INFO] consul: Handled member-join event for server "Node 4a21db28-2f80-803e-6945-54fc57da142a.dc1" in area "wan"
TestAgent_HostBadACL - 2019/12/30 18:58:04.279332 [INFO] agent: Started DNS server 127.0.0.1:17769 (tcp)
TestAgent_HostBadACL - 2019/12/30 18:58:04.279494 [INFO] agent: Started DNS server 127.0.0.1:17769 (udp)
TestAgent_HostBadACL - 2019/12/30 18:58:04.281905 [INFO] agent: Started HTTP server on 127.0.0.1:17770 (tcp)
TestAgent_HostBadACL - 2019/12/30 18:58:04.282014 [INFO] agent: started state syncer
TestAgent_StartStop - 2019/12/30 18:58:04.290025 [INFO] consul: server starting leave
2019/12/30 18:58:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:04 [INFO]  raft: Node at 127.0.0.1:17774 [Candidate] entering Candidate state in term 2
2019/12/30 18:58:04 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:04 [INFO]  raft: Node at 127.0.0.1:17768 [Leader] entering Leader state
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.399162 [INFO] consul: cluster leadership acquired
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.399720 [INFO] consul: New leader elected: Node 83ab5a5a-8f73-b6bc-2628-f1a7aa37a72c
TestAgent_StartStop - 2019/12/30 18:58:04.538009 [INFO] agent: Synced node info
TestAgent_StartStop - 2019/12/30 18:58:04.538191 [DEBUG] agent: Node info in sync
TestAgent_StartStop - 2019/12/30 18:58:04.538510 [INFO] serf: EventMemberLeave: Node dbe003c1-5517-0c0f-238e-abedbba758ee.dc1 127.0.0.1
TestAgent_StartStop - 2019/12/30 18:58:04.538764 [INFO] consul: Handled member-leave event for server "Node dbe003c1-5517-0c0f-238e-abedbba758ee.dc1" in area "wan"
TestAgent_StartStop - 2019/12/30 18:58:04.538826 [INFO] manager: shutting down
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.693105 [INFO] agent: Requesting shutdown
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.693227 [INFO] consul: shutting down server
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.693278 [WARN] serf: Shutdown without a Leave
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.782480 [WARN] serf: Shutdown without a Leave
TestAgent_StartStop - 2019/12/30 18:58:04.810440 [DEBUG] agent: Node info in sync
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.870278 [INFO] manager: shutting down
TestAgent_RPCPing - 2019/12/30 18:58:04.877739 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_RPCPing - 2019/12/30 18:58:04.878269 [DEBUG] consul: Skipping self join check for "Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c" since the cluster is too small
TestAgent_RPCPing - 2019/12/30 18:58:04.878461 [INFO] consul: member 'Node 0a2b2a46-6cdc-6e09-e813-cbd11f445e7c' joined, marking health alive
TestAgent_RPCPing - 2019/12/30 18:58:04.910963 [DEBUG] agent: Node info in sync
TestAgent_RPCPing - 2019/12/30 18:58:04.911104 [DEBUG] agent: Node info in sync
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.954566 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.954746 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.954808 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.954839 [INFO] agent: consul server down
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.954883 [INFO] agent: shutdown complete
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.954936 [INFO] agent: Stopping DNS server 127.0.0.1:17763 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.955095 [INFO] agent: Stopping DNS server 127.0.0.1:17763 (udp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.955391 [INFO] agent: Stopping HTTP server 127.0.0.1:17764 (tcp)
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.955701 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ReconnectConfigSettings - 2019/12/30 18:58:04.955793 [INFO] agent: Endpoints down
--- PASS: TestAgent_ReconnectConfigSettings (7.10s)
=== CONT  TestAgent_Host
2019/12/30 18:58:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:05 [INFO]  raft: Node at 127.0.0.1:17774 [Leader] entering Leader state
TestAgent_HostBadACL - 2019/12/30 18:58:05.048445 [INFO] consul: cluster leadership acquired
TestAgent_HostBadACL - 2019/12/30 18:58:05.048898 [INFO] consul: New leader elected: Node 4a21db28-2f80-803e-6945-54fc57da142a
TestAgent_RPCPing - 2019/12/30 18:58:05.066557 [INFO] agent: Requesting shutdown
TestAgent_RPCPing - 2019/12/30 18:58:05.066685 [INFO] consul: shutting down server
TestAgent_RPCPing - 2019/12/30 18:58:05.066742 [WARN] serf: Shutdown without a Leave
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Host - 2019/12/30 18:58:05.161243 [WARN] agent: Node name "Node bcc52d2b-f28d-f2d0-081b-1407d0cec975" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Host - 2019/12/30 18:58:05.162142 [DEBUG] tlsutil: Update with version 1
TestAgent_Host - 2019/12/30 18:58:05.165627 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_RPCPing - 2019/12/30 18:58:05.244239 [WARN] serf: Shutdown without a Leave
TestAgent_HostBadACL - 2019/12/30 18:58:05.350699 [ERR] agent: failed to sync remote state: ACL not found
TestAgent_RPCPing - 2019/12/30 18:58:05.406764 [INFO] manager: shutting down
TestAgent_RPCPing - 2019/12/30 18:58:05.407233 [INFO] agent: consul server down
TestAgent_RPCPing - 2019/12/30 18:58:05.407289 [INFO] agent: shutdown complete
TestAgent_RPCPing - 2019/12/30 18:58:05.407347 [INFO] agent: Stopping DNS server 127.0.0.1:17751 (tcp)
TestAgent_RPCPing - 2019/12/30 18:58:05.407508 [INFO] agent: Stopping DNS server 127.0.0.1:17751 (udp)
TestAgent_RPCPing - 2019/12/30 18:58:05.407668 [INFO] agent: Stopping HTTP server 127.0.0.1:17752 (tcp)
TestAgent_RPCPing - 2019/12/30 18:58:05.407872 [INFO] agent: Waiting for endpoints to shut down
TestAgent_RPCPing - 2019/12/30 18:58:05.407940 [INFO] agent: Endpoints down
--- PASS: TestAgent_RPCPing (6.22s)
=== CONT  TestAgentConnectAuthorize_defaultAllow
TestAgent_HostBadACL - 2019/12/30 18:58:05.504917 [INFO] acl: initializing acls
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:05.568231 [WARN] agent: Node name "Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:05.568858 [DEBUG] tlsutil: Update with version 1
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:05.571902 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_HostBadACL - 2019/12/30 18:58:05.754666 [INFO] consul: Created ACL 'global-management' policy
TestAgent_HostBadACL - 2019/12/30 18:58:05.754764 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgent_HostBadACL - 2019/12/30 18:58:05.831089 [INFO] acl: initializing acls
TestAgent_HostBadACL - 2019/12/30 18:58:05.831532 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgent_StartStop - 2019/12/30 18:58:05.913403 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_HostBadACL - 2019/12/30 18:58:05.913660 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgent_StartStop - 2019/12/30 18:58:05.913879 [DEBUG] consul: Skipping self join check for "Node dbe003c1-5517-0c0f-238e-abedbba758ee" since the cluster is too small
TestAgent_StartStop - 2019/12/30 18:58:05.914067 [INFO] consul: member 'Node dbe003c1-5517-0c0f-238e-abedbba758ee' joined, marking health alive
2019/12/30 18:58:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bcc52d2b-f28d-f2d0-081b-1407d0cec975 Address:127.0.0.1:17780}]
2019/12/30 18:58:06 [INFO]  raft: Node at 127.0.0.1:17780 [Follower] entering Follower state (Leader: "")
TestAgent_Host - 2019/12/30 18:58:06.349646 [INFO] serf: EventMemberJoin: Node bcc52d2b-f28d-f2d0-081b-1407d0cec975.dc1 127.0.0.1
TestAgent_Host - 2019/12/30 18:58:06.352663 [INFO] serf: EventMemberJoin: Node bcc52d2b-f28d-f2d0-081b-1407d0cec975 127.0.0.1
TestAgent_Host - 2019/12/30 18:58:06.353958 [INFO] agent: Started DNS server 127.0.0.1:17775 (udp)
TestAgent_Host - 2019/12/30 18:58:06.355722 [INFO] consul: Adding LAN server Node bcc52d2b-f28d-f2d0-081b-1407d0cec975 (Addr: tcp/127.0.0.1:17780) (DC: dc1)
TestAgent_Host - 2019/12/30 18:58:06.356104 [INFO] consul: Handled member-join event for server "Node bcc52d2b-f28d-f2d0-081b-1407d0cec975.dc1" in area "wan"
TestAgent_Host - 2019/12/30 18:58:06.356911 [INFO] agent: Started DNS server 127.0.0.1:17775 (tcp)
TestAgent_Host - 2019/12/30 18:58:06.359362 [INFO] agent: Started HTTP server on 127.0.0.1:17776 (tcp)
TestAgent_Host - 2019/12/30 18:58:06.362613 [INFO] agent: started state syncer
2019/12/30 18:58:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:06 [INFO]  raft: Node at 127.0.0.1:17780 [Candidate] entering Candidate state in term 2
TestAgent_HostBadACL - 2019/12/30 18:58:06.463227 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgent_HostBadACL - 2019/12/30 18:58:06.464462 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_HostBadACL - 2019/12/30 18:58:06.465432 [INFO] serf: EventMemberUpdate: Node 4a21db28-2f80-803e-6945-54fc57da142a
TestAgent_HostBadACL - 2019/12/30 18:58:06.466137 [INFO] serf: EventMemberUpdate: Node 4a21db28-2f80-803e-6945-54fc57da142a.dc1
2019/12/30 18:58:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00 Address:127.0.0.1:17786}]
2019/12/30 18:58:06 [INFO]  raft: Node at 127.0.0.1:17786 [Follower] entering Follower state (Leader: "")
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.632848 [INFO] serf: EventMemberJoin: Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00.dc1 127.0.0.1
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.636360 [INFO] serf: EventMemberJoin: Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00 127.0.0.1
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.636990 [INFO] consul: Adding LAN server Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00 (Addr: tcp/127.0.0.1:17786) (DC: dc1)
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.637194 [INFO] consul: Handled member-join event for server "Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00.dc1" in area "wan"
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.637602 [INFO] agent: Started DNS server 127.0.0.1:17781 (udp)
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.637686 [INFO] agent: Started DNS server 127.0.0.1:17781 (tcp)
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.641368 [INFO] agent: Started HTTP server on 127.0.0.1:17782 (tcp)
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:06.641509 [INFO] agent: started state syncer
2019/12/30 18:58:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:06 [INFO]  raft: Node at 127.0.0.1:17786 [Candidate] entering Candidate state in term 2
TestAgent_HostBadACL - 2019/12/30 18:58:06.749019 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_HostBadACL - 2019/12/30 18:58:06.749105 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgent_HostBadACL - 2019/12/30 18:58:06.749961 [INFO] serf: EventMemberUpdate: Node 4a21db28-2f80-803e-6945-54fc57da142a
TestAgent_HostBadACL - 2019/12/30 18:58:06.750588 [INFO] serf: EventMemberUpdate: Node 4a21db28-2f80-803e-6945-54fc57da142a.dc1
TestAgent_StartStop - 2019/12/30 18:58:06.872203 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:58:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:07 [INFO]  raft: Node at 127.0.0.1:17780 [Leader] entering Leader state
TestAgent_Host - 2019/12/30 18:58:07.168259 [INFO] consul: cluster leadership acquired
TestAgent_Host - 2019/12/30 18:58:07.168891 [INFO] consul: New leader elected: Node bcc52d2b-f28d-f2d0-081b-1407d0cec975
TestAgent_Host - 2019/12/30 18:58:07.388435 [ERR] agent: failed to sync remote state: ACL not found
2019/12/30 18:58:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:07 [INFO]  raft: Node at 127.0.0.1:17786 [Leader] entering Leader state
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:07.401595 [INFO] consul: cluster leadership acquired
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:07.402053 [INFO] consul: New leader elected: Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:07.538393 [ERR] agent: failed to sync remote state: ACL not found
TestAgent_StartStop - 2019/12/30 18:58:07.538946 [INFO] serf: EventMemberLeave: Node dbe003c1-5517-0c0f-238e-abedbba758ee 127.0.0.1
TestAgent_StartStop - 2019/12/30 18:58:07.539257 [INFO] consul: Removing LAN server Node dbe003c1-5517-0c0f-238e-abedbba758ee (Addr: tcp/127.0.0.1:17762) (DC: dc1)
TestAgent_StartStop - 2019/12/30 18:58:07.539533 [WARN] consul: deregistering self (Node dbe003c1-5517-0c0f-238e-abedbba758ee) should be done by follower
TestAgent_Host - 2019/12/30 18:58:07.565147 [INFO] acl: initializing acls
TestAgent_HostBadACL - 2019/12/30 18:58:07.627240 [ERR] agent: failed to sync remote state: ACL not found
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:07.688188 [ERR] agent: failed to sync remote state: ACL not found
TestAgent_Host - 2019/12/30 18:58:07.796107 [INFO] consul: Created ACL 'global-management' policy
TestAgent_Host - 2019/12/30 18:58:07.796209 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:07.870687 [INFO] acl: initializing acls
TestAgent_HostBadACL - 2019/12/30 18:58:07.871247 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_HostBadACL - 2019/12/30 18:58:07.871799 [DEBUG] consul: Skipping self join check for "Node 4a21db28-2f80-803e-6945-54fc57da142a" since the cluster is too small
TestAgent_HostBadACL - 2019/12/30 18:58:07.871971 [INFO] consul: member 'Node 4a21db28-2f80-803e-6945-54fc57da142a' joined, marking health alive
TestAgent_Host - 2019/12/30 18:58:07.907929 [INFO] acl: initializing acls
TestAgent_Host - 2019/12/30 18:58:07.908068 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgent_Host - 2019/12/30 18:58:07.963191 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.029376 [INFO] consul: Created ACL 'global-management' policy
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.029556 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgent_HostBadACL - 2019/12/30 18:58:08.031382 [DEBUG] consul: Skipping self join check for "Node 4a21db28-2f80-803e-6945-54fc57da142a" since the cluster is too small
TestAgent_HostBadACL - 2019/12/30 18:58:08.032075 [DEBUG] consul: Skipping self join check for "Node 4a21db28-2f80-803e-6945-54fc57da142a" since the cluster is too small
TestAgent_HostBadACL - 2019/12/30 18:58:08.047164 [DEBUG] consul: dropping node "Node 4a21db28-2f80-803e-6945-54fc57da142a" from result due to ACLs
TestAgent_HostBadACL - 2019/12/30 18:58:08.047623 [INFO] agent: Requesting shutdown
TestAgent_HostBadACL - 2019/12/30 18:58:08.047701 [INFO] consul: shutting down server
TestAgent_HostBadACL - 2019/12/30 18:58:08.047762 [WARN] serf: Shutdown without a Leave
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.195787 [INFO] acl: initializing acls
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.195939 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgent_HostBadACL - 2019/12/30 18:58:08.220227 [WARN] serf: Shutdown without a Leave
TestAgent_HostBadACL - 2019/12/30 18:58:08.295240 [INFO] manager: shutting down
TestAgent_HostBadACL - 2019/12/30 18:58:08.295668 [INFO] agent: consul server down
TestAgent_HostBadACL - 2019/12/30 18:58:08.295718 [INFO] agent: shutdown complete
TestAgent_HostBadACL - 2019/12/30 18:58:08.295768 [INFO] agent: Stopping DNS server 127.0.0.1:17769 (tcp)
TestAgent_HostBadACL - 2019/12/30 18:58:08.295910 [INFO] agent: Stopping DNS server 127.0.0.1:17769 (udp)
TestAgent_HostBadACL - 2019/12/30 18:58:08.296064 [INFO] agent: Stopping HTTP server 127.0.0.1:17770 (tcp)
TestAgent_HostBadACL - 2019/12/30 18:58:08.296281 [INFO] agent: Waiting for endpoints to shut down
TestAgent_HostBadACL - 2019/12/30 18:58:08.296357 [INFO] agent: Endpoints down
--- PASS: TestAgent_HostBadACL (6.02s)
=== CONT  TestAgentConnectAuthorize_defaultDeny
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.300501 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgent_Host - 2019/12/30 18:58:08.472516 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgent_Host - 2019/12/30 18:58:08.472975 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_Host - 2019/12/30 18:58:08.474763 [INFO] serf: EventMemberUpdate: Node bcc52d2b-f28d-f2d0-081b-1407d0cec975
TestAgent_Host - 2019/12/30 18:58:08.475595 [INFO] serf: EventMemberUpdate: Node bcc52d2b-f28d-f2d0-081b-1407d0cec975.dc1
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:08.519051 [WARN] agent: Node name "Node 007dd81f-041f-b405-aa95-19e18c937ce6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:08.519756 [DEBUG] tlsutil: Update with version 1
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:08.531818 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Host - 2019/12/30 18:58:08.723239 [INFO] consul: Created ACL anonymous token from configuration
TestAgent_Host - 2019/12/30 18:58:08.723325 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgent_Host - 2019/12/30 18:58:08.724154 [INFO] serf: EventMemberUpdate: Node bcc52d2b-f28d-f2d0-081b-1407d0cec975
TestAgent_Host - 2019/12/30 18:58:08.724924 [INFO] serf: EventMemberUpdate: Node bcc52d2b-f28d-f2d0-081b-1407d0cec975.dc1
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.787504 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.788433 [INFO] consul: Created ACL anonymous token from configuration
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.789513 [INFO] serf: EventMemberUpdate: Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:08.790156 [INFO] serf: EventMemberUpdate: Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00.dc1
TestAgent_StartStop - 2019/12/30 18:58:08.871687 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:09.081208 [INFO] consul: Created ACL anonymous token from configuration
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:09.081308 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:09.082326 [INFO] serf: EventMemberUpdate: Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:09.083038 [INFO] serf: EventMemberUpdate: Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00.dc1
2019/12/30 18:58:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:007dd81f-041f-b405-aa95-19e18c937ce6 Address:127.0.0.1:17792}]
2019/12/30 18:58:09 [INFO]  raft: Node at 127.0.0.1:17792 [Follower] entering Follower state (Leader: "")
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.437746 [INFO] serf: EventMemberJoin: Node 007dd81f-041f-b405-aa95-19e18c937ce6.dc1 127.0.0.1
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.441464 [INFO] serf: EventMemberJoin: Node 007dd81f-041f-b405-aa95-19e18c937ce6 127.0.0.1
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.442054 [INFO] consul: Adding LAN server Node 007dd81f-041f-b405-aa95-19e18c937ce6 (Addr: tcp/127.0.0.1:17792) (DC: dc1)
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.442253 [INFO] consul: Handled member-join event for server "Node 007dd81f-041f-b405-aa95-19e18c937ce6.dc1" in area "wan"
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.442762 [INFO] agent: Started DNS server 127.0.0.1:17787 (udp)
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.442833 [INFO] agent: Started DNS server 127.0.0.1:17787 (tcp)
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.445722 [INFO] agent: Started HTTP server on 127.0.0.1:17788 (tcp)
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:09.445855 [INFO] agent: started state syncer
2019/12/30 18:58:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:09 [INFO]  raft: Node at 127.0.0.1:17792 [Candidate] entering Candidate state in term 2
TestAgent_Host - 2019/12/30 18:58:09.713626 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_Host - 2019/12/30 18:58:09.714170 [DEBUG] consul: Skipping self join check for "Node bcc52d2b-f28d-f2d0-081b-1407d0cec975" since the cluster is too small
TestAgent_Host - 2019/12/30 18:58:09.714278 [INFO] consul: member 'Node bcc52d2b-f28d-f2d0-081b-1407d0cec975' joined, marking health alive
TestAgent_Host - 2019/12/30 18:58:09.780820 [ERR] agent: failed to sync remote state: ACL not found
TestAgent_Host - 2019/12/30 18:58:09.984108 [DEBUG] consul: Skipping self join check for "Node bcc52d2b-f28d-f2d0-081b-1407d0cec975" since the cluster is too small
TestAgent_Host - 2019/12/30 18:58:09.984778 [DEBUG] consul: Skipping self join check for "Node bcc52d2b-f28d-f2d0-081b-1407d0cec975" since the cluster is too small
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.055468 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.056323 [DEBUG] consul: Skipping self join check for "Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00" since the cluster is too small
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.056449 [INFO] consul: member 'Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00' joined, marking health alive
TestAgent_Host - 2019/12/30 18:58:10.106486 [INFO] agent: Requesting shutdown
TestAgent_Host - 2019/12/30 18:58:10.106598 [INFO] consul: shutting down server
TestAgent_Host - 2019/12/30 18:58:10.106670 [WARN] serf: Shutdown without a Leave
2019/12/30 18:58:10 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:10 [INFO]  raft: Node at 127.0.0.1:17792 [Leader] entering Leader state
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:10.145685 [INFO] consul: cluster leadership acquired
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:10.146170 [INFO] consul: New leader elected: Node 007dd81f-041f-b405-aa95-19e18c937ce6
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:10.148417 [ERR] agent: failed to sync remote state: ACL not found
TestAgentConnectAuthorize_defaultDeny - 2019/12/30 18:58:10.194512 [INFO] acl: initializing acls
TestAgent_Host - 2019/12/30 18:58:10.212048 [WARN] serf: Shutdown without a Leave
TestAgent_Host - 2019/12/30 18:58:10.303724 [INFO] manager: shutting down
TestAgent_Host - 2019/12/30 18:58:10.304271 [INFO] agent: consul server down
TestAgent_Host - 2019/12/30 18:58:10.304327 [INFO] agent: shutdown complete
TestAgent_Host - 2019/12/30 18:58:10.304438 [INFO] agent: Stopping DNS server 127.0.0.1:17775 (tcp)
TestAgent_Host - 2019/12/30 18:58:10.304608 [INFO] agent: Stopping DNS server 127.0.0.1:17775 (udp)
TestAgent_Host - 2019/12/30 18:58:10.304798 [INFO] agent: Stopping HTTP server 127.0.0.1:17776 (tcp)
TestAgent_Host - 2019/12/30 18:58:10.305057 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Host - 2019/12/30 18:58:10.305136 [INFO] agent: Endpoints down
--- PASS: TestAgent_Host (5.35s)
=== CONT  TestAgentConnectAuthorize_serviceWrite
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.306508 [DEBUG] consul: Skipping self join check for "Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00" since the cluster is too small
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.306998 [DEBUG] consul: Skipping self join check for "Node bb3c4457-2f7e-ecb9-9c3d-3db6569ccc00" since the cluster is too small
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.324535 [INFO] agent: Requesting shutdown
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.324653 [INFO] consul: shutting down server
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.324703 [WARN] serf: Shutdown without a Leave
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentConnectAuthorize_serviceWrite - 2019/12/30 18:58:10.472540 [WARN] agent: Node name "Node 753ab56f-d704-4364-28b5-900fe2e10b80" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentConnectAuthorize_serviceWrite - 2019/12/30 18:58:10.473034 [DEBUG] tlsutil: Update with version 1
TestAgentConnectAuthorize_serviceWrite - 2019/12/30 18:58:10.476288 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentConnectAuthorize_defaultAllow - 2019/12/30 18:58:10.487129 [WARN] serf: Shutdown without a Leave
TestAgent_StartStop - 2019/12/30 18:58:10.544898 [INFO] consul: Waiting 5s to drain RPC traffic
panic: test timed out after 7m0s

goroutine 34057 [running]:
testing.(*M).startAlarm.func1()
	/usr/lib/go-1.13/src/testing/testing.go:1377 +0xbc
created by time.goFunc
	/usr/lib/go-1.13/src/time/sleep.go:168 +0x34

goroutine 1 [chan receive, 4 minutes]:
testing.tRunner.func1(0x48ce000)
	/usr/lib/go-1.13/src/testing/testing.go:885 +0x1b4
testing.tRunner(0x48ce000, 0x4b43ed0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
testing.runTests(0x4bc6100, 0x32b2ce0, 0x1f6, 0x1f6, 0x0)
	/usr/lib/go-1.13/src/testing/testing.go:1200 +0x238
testing.(*M).Run(0x4886980, 0x0)
	/usr/lib/go-1.13/src/testing/testing.go:1117 +0x13c
main.main()
	_testmain.go:1046 +0x120

goroutine 6 [syscall, 7 minutes]:
os/signal.signal_recv(0x0)
	/usr/lib/go-1.13/src/runtime/sigqueue.go:147 +0x130
os/signal.loop()
	/usr/lib/go-1.13/src/os/signal/signal_unix.go:23 +0x14
created by os/signal.init.0
	/usr/lib/go-1.13/src/os/signal/signal_unix.go:29 +0x30

goroutine 8 [chan receive]:
github.com/golang/glog.(*loggingT).flushDaemon(0x32b9788)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/golang/glog/glog.go:882 +0x70
created by github.com/golang/glog.init.0
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/golang/glog/glog.go:410 +0x214

goroutine 14 [select]:
go.opencensus.io/stats/view.(*worker).start(0x48861c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/go.opencensus.io/stats/view/worker.go:154 +0xb0
created by go.opencensus.io/stats/view.init.0
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/go.opencensus.io/stats/view/worker.go:32 +0x48

goroutine 29942 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x89d6fc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e4240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e4240, 0x8628380)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x72d8b70, 0x65e4240, 0x8628380)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 16 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48ce140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Update(0x48ce140)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:73 +0x20
testing.tRunner(0x48ce140, 0x1d9cac4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 24 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_UpdateUpsert(0x4bea140)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:103 +0x20
testing.tRunner(0x4bea140, 0x1d9cac0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 25 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea1e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Destroy(0x4bea1e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:132 +0x20
testing.tRunner(0x4bea1e0, 0x1d9caa8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 26 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Clone(0x4bea280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:164 +0x20
testing.tRunner(0x4bea280, 0x1d9caa4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 27 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Get(0x4bea320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:208 +0x1c
testing.tRunner(0x4bea320, 0x1d9cab8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 29 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACLReplicationStatus(0x4bea460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:283 +0x1c
testing.tRunner(0x4bea460, 0x1d9ca8c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 30 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Disabled_Response(0x4bea500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_test.go:23 +0x20
testing.tRunner(0x4bea500, 0x1d9ca9c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 31 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea5a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Bootstrap(0x4bea5a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_test.go:79 +0x20
testing.tRunner(0x4bea5a0, 0x1d9ca98)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 32 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_HTTP(0x4bea640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_test.go:127 +0x1c
testing.tRunner(0x4bea640, 0x1d9caa0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 33 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea6e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_LoginProcedure_HTTP(0x4bea6e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_test.go:1081 +0x20
testing.tRunner(0x4bea6e0, 0x1d9cac8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 66 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Version8(0x4bea780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:151 +0x1c
testing.tRunner(0x4bea780, 0x1d9cad8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 67 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_AgentMasterToken(0x4bea820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:185 +0x1c
testing.tRunner(0x4bea820, 0x1d9ca94)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 68 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea8c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_RootAuthorizersDenied(0x4bea8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:205 +0x1c
testing.tRunner(0x4bea8c0, 0x1d9cacc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 69 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bea960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetServiceRegister(0x4bea960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:272 +0x20
testing.tRunner(0x4bea960, 0x1d9caf0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 70 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beaa00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetServiceUpdate(0x4beaa00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:303 +0x1c
testing.tRunner(0x4beaa00, 0x1d9caf4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 71 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beaaa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetCheckRegister(0x4beaaa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:326 +0x20
testing.tRunner(0x4beaaa0, 0x1d9cae8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 72 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beab40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetCheckUpdate(0x4beab40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:392 +0x1c
testing.tRunner(0x4beab40, 0x1d9caec)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 73 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beabe0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_filterMembers(0x4beabe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:432 +0x1c
testing.tRunner(0x4beabe0, 0x1d9cae0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 74 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beac80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_filterServices(0x4beac80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:451 +0x1c
testing.tRunner(0x4beac80, 0x1d9cae4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 75 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4bead20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_filterChecks(0x4bead20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/acl_test.go:465 +0x1c
testing.tRunner(0x4bead20, 0x1d9cadc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 76 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beadc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services(0x4beadc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:57 +0x20
testing.tRunner(0x4beadc0, 0x1d9cd7c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 77 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beae60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ServicesFiltered(0x4beae60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:105 +0x1c
testing.tRunner(0x4beae60, 0x1d9cd6c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 78 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beaf00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services_ExternalConnectProxy(0x4beaf00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:149 +0x20
testing.tRunner(0x4beaf00, 0x1d9cd74)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 79 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beafa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services_Sidecar(0x4beafa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:185 +0x20
testing.tRunner(0x4beafa0, 0x1d9cd78)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 80 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services_ACLFilter(0x4beb040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:235 +0x1c
testing.tRunner(0x4beb040, 0x1d9cd70)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 82 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Service_DeprecatedManagedProxy(0x4beb180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:571 +0x20
testing.tRunner(0x4beb180, 0x1d9cd5c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 83 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Checks(0x4beb220)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:648 +0x1c
testing.tRunner(0x4beb220, 0x1d9cbbc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 84 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb2c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ChecksWithFilter(0x4beb2c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:676 +0x1c
testing.tRunner(0x4beb2c0, 0x1d9cbb4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 85 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_HealthServiceByID(0x4beb360)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:707 +0x1c
testing.tRunner(0x4beb360, 0x1d9cc00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 86 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_HealthServiceByName(0x4beb400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:902 +0x1c
testing.tRunner(0x4beb400, 0x1d9cc04)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 87 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb4a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Checks_ACLFilter(0x4beb4a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1145 +0x1c
testing.tRunner(0x4beb4a0, 0x1d9cbb8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 88 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4beb540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Self(0x4beb540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1184 +0x20
testing.tRunner(0x4beb540, 0x1d9cd48)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 35 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4936000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Self_ACLDeny(0x4936000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1222 +0x1c
testing.tRunner(0x4936000, 0x1d9cd44)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 51 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Metrics_ACLDeny(0x4982000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1251 +0x1c
testing.tRunner(0x4982000, 0x1d9cc54)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 52 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49820a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Reload(0x49820a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1280 +0x20
testing.tRunner(0x49820a0, 0x1d9cd28)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 53 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Reload_ACLDeny(0x4982140)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1353 +0x1c
testing.tRunner(0x4982140, 0x1d9cd24)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 54 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49821e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Members(0x49821e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1380 +0x1c
testing.tRunner(0x49821e0, 0x1d9cc50)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 55 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Members_WAN(0x4982280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1401 +0x1c
testing.tRunner(0x4982280, 0x1d9cc4c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 56 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Members_ACLFilter(0x4982320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1422 +0x1c
testing.tRunner(0x4982320, 0x1d9cc48)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 57 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49823c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Join(0x49823c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1453 +0x20
testing.tRunner(0x49823c0, 0x1d9cc34)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 58 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Join_WAN(0x4982460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1483 +0x20
testing.tRunner(0x4982460, 0x1d9cc30)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 59 [chan send]:
testing.tRunner.func1(0x4982500)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4982500, 0x1d9cc2c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 60 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49825a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_JoinLANNotify(0x49825a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1555 +0x20
testing.tRunner(0x49825a0, 0x1d9cc28)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 62 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49826e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Leave_ACLDeny(0x49826e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1618 +0x1c
testing.tRunner(0x49826e0, 0x1d9cc38)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 64 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ForceLeave_ACLDeny(0x4982820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1693 +0x1c
testing.tRunner(0x4982820, 0x1d9cbe4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 65 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49828c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck(0x49828c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1722 +0x1c
testing.tRunner(0x49828c0, 0x1d9ccd8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 98 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_Scripts(0x4982960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1765 +0x20
testing.tRunner(0x4982960, 0x1d9ccd0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 99 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheckScriptsExecDisable(0x4982a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1850 +0x20
testing.tRunner(0x4982a00, 0x1d9ccbc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 100 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheckScriptsExecRemoteDisable(0x4982aa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1876 +0x20
testing.tRunner(0x4982aa0, 0x1d9ccc0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 101 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_Passing(0x4982b40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1904 +0x1c
testing.tRunner(0x4982b40, 0x1d9cccc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 102 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_BadStatus(0x4982be0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1940 +0x1c
testing.tRunner(0x4982be0, 0x1d9ccc8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 103 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_ACLDeny(0x4982c80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1961 +0x20
testing.tRunner(0x4982c80, 0x1d9ccc4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 104 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterCheck(0x4982d20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2086 +0x1c
testing.tRunner(0x4982d20, 0x1d9cbc8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 105 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterCheckACLDeny(0x4982dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2112 +0x1c
testing.tRunner(0x4982dc0, 0x1d9cbc4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 106 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982e60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PassCheck(0x4982e60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2138 +0x1c
testing.tRunner(0x4982e60, 0x1d9cc84)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 107 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982f00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PassCheck_ACLDeny(0x4982f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2166 +0x1c
testing.tRunner(0x4982f00, 0x1d9cc80)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 108 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4982fa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_WarnCheck(0x4982fa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2193 +0x1c
testing.tRunner(0x4982fa0, 0x1d9cda4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 109 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_WarnCheck_ACLDeny(0x4983040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2221 +0x1c
testing.tRunner(0x4983040, 0x1d9cda0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 110 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49830e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_FailCheck(0x49830e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2248 +0x1c
testing.tRunner(0x49830e0, 0x1d9cbe0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 111 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_FailCheck_ACLDeny(0x4983180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2276 +0x1c
testing.tRunner(0x4983180, 0x1d9cbdc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 112 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_UpdateCheck(0x4983220)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2303 +0x20
testing.tRunner(0x4983220, 0x1d9cd9c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 113 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49832c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_UpdateCheck_ACLDeny(0x49832c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2388 +0x1c
testing.tRunner(0x49832c0, 0x1d9cd98)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 114 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService(0x4983360)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2417 +0x1c
testing.tRunner(0x4983360, 0x1d9cd14)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 115 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_TranslateKeys(0x4983400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2484 +0x1c
testing.tRunner(0x4983400, 0x1d9cd08)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 116 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49834a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ACLDeny(0x49834a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2708 +0x1c
testing.tRunner(0x49834a0, 0x1d9cce8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 117 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_InvalidAddress(0x4983540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2746 +0x1c
testing.tRunner(0x4983540, 0x1d9ccf0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 118 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49835e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ManagedConnectProxy(0x49835e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2776 +0x20
testing.tRunner(0x49835e0, 0x1d9ccfc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 119 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ManagedConnectProxyDeprecated(0x4983680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2847 +0x20
testing.tRunner(0x4983680, 0x1d9ccf4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 120 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ManagedConnectProxy_Disabled(0x4983720)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2945 +0x20
testing.tRunner(0x4983720, 0x1d9ccf8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 121 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49837c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_UnmanagedConnectProxy(0x49837c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2984 +0x20
testing.tRunner(0x49837c0, 0x1d9cd10)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 123 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_UnmanagedConnectProxyInvalid(0x4983900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3589 +0x20
testing.tRunner(0x4983900, 0x1d9cd0c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 124 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49839a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ConnectNative(0x49839a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3622 +0x20
testing.tRunner(0x49839a0, 0x1d9ccec)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 125 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ScriptCheck_ExecDisable(0x4983a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3656 +0x1c
testing.tRunner(0x4983a40, 0x1d9cd00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 126 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983ae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable(0x4983ae0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3692 +0x1c
testing.tRunner(0x4983ae0, 0x1d9cd04)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 127 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983b80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService(0x4983b80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3730 +0x1c
testing.tRunner(0x4983b80, 0x1d9cbd8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 128 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983c20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService_ACLDeny(0x4983c20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3763 +0x1c
testing.tRunner(0x4983c20, 0x1d9cbcc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 129 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983cc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService_withManagedProxy(0x4983cc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3792 +0x20
testing.tRunner(0x4983cc0, 0x1d9cbd4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 130 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService_managedProxyDirect(0x4983d60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3848 +0x20
testing.tRunner(0x4983d60, 0x1d9cbd0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 131 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ServiceMaintenance_BadRequest(0x4983e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3896 +0x1c
testing.tRunner(0x4983e00, 0x1d9cd50)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 133 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4983f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ServiceMaintenance_Disable(0x4983f40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3980 +0x20
testing.tRunner(0x4983f40, 0x1d9cd54)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 134 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ServiceMaintenance_ACLDeny(0x49b4000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4017 +0x1c
testing.tRunner(0x49b4000, 0x1d9cd4c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 135 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b40a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_BadRequest(0x49b40a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4047 +0x1c
testing.tRunner(0x49b40a0, 0x1d9cc74)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 136 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_Enable(0x49b4140)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4064 +0x20
testing.tRunner(0x49b4140, 0x1d9cc7c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 137 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b41e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_Disable(0x49b41e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4097 +0x1c
testing.tRunner(0x49b41e0, 0x1d9cc78)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 138 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_ACLDeny(0x49b4280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4122 +0x1c
testing.tRunner(0x49b4280, 0x1d9cc70)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 139 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_Service(0x49b4320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4143 +0x1c
testing.tRunner(0x49b4320, 0x1d9ccd4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 141 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Monitor_ACLDeny(0x49b4460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4259 +0x1c
testing.tRunner(0x49b4460, 0x1d9cc5c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 142 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Token(0x49b4500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4277 +0x20
testing.tRunner(0x49b4500, 0x1d9cd94)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 143 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b45a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCARoots_empty(0x49b45a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4535 +0x1c
testing.tRunner(0x49b45a0, 0x1d9cb44)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 144 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCARoots_list(0x49b4640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4550 +0x20
testing.tRunner(0x49b4640, 0x1d9cb48)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 145 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b46e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclDefaultDeny(0x49b46e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4628 +0x20
testing.tRunner(0x49b46e0, 0x1d9cb24)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 146 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclProxyToken(0x49b4780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4665 +0x20
testing.tRunner(0x49b4780, 0x1d9cb2c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 147 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclProxyTokenOther(0x49b4820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4711 +0x20
testing.tRunner(0x49b4820, 0x1d9cb28)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 148 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b48c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclServiceWrite(0x49b48c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4776 +0x20
testing.tRunner(0x49b48c0, 0x1d9cb34)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 149 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclServiceReadDeny(0x49b4960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4834 +0x20
testing.tRunner(0x49b4960, 0x1d9cb30)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 150 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_good(0x49b4a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4889 +0x20
testing.tRunner(0x49b4a00, 0x1d9cb40)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 153 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclDefaultDeny(0x49b4be0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5335 +0x20
testing.tRunner(0x49b4be0, 0x1d9cb54)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 154 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclProxyToken(0x49b4c80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5371 +0x20
testing.tRunner(0x49b4c80, 0x1d9cb58)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 155 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclServiceWrite(0x49b4d20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5419 +0x20
testing.tRunner(0x49b4d20, 0x1d9cb60)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 156 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclServiceReadDeny(0x49b4dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5478 +0x20
testing.tRunner(0x49b4dc0, 0x1d9cb5c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 158 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4f00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_badBody(0x49b4f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5888 +0x20
testing.tRunner(0x49b4f00, 0x1d9cb00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 159 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b4fa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_noTarget(0x49b4fa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5908 +0x20
testing.tRunner(0x49b4fa0, 0x1d9cb1c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 160 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b5040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_idInvalidFormat(0x49b5040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5929 +0x20
testing.tRunner(0x49b5040, 0x1d9cb14)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 161 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b50e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_idNotService(0x49b50e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5953 +0x20
testing.tRunner(0x49b50e0, 0x1d9cb18)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 162 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b5180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_allow(0x49b5180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5977 +0x20
testing.tRunner(0x49b5180, 0x1d9cafc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 163 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b5220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_deny(0x49b5220)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6074 +0x20
testing.tRunner(0x49b5220, 0x1d9cb10)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 164 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b52c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_allowTrustDomain(0x49b52c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6123 +0x20
testing.tRunner(0x49b52c0, 0x1d9caf8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 165 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x49b5360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_denyWildcard(0x49b5360)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6168 +0x20
testing.tRunner(0x49b5360, 0x1d9cb0c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 166 [runnable]:
syscall.Syscall(0x94, 0x7d, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x7d, 0x4000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x9b1f560, 0x4000, 0x4000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*DB).init(0x9b1f560, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/boltdb/bolt/db.go:382 +0x20c
github.com/boltdb/bolt.Open(0x9986cd0, 0x47, 0x180, 0x32c9e00, 0x9986cd0, 0x47, 0x9986cd0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/boltdb/bolt/db.go:199 +0x1c4
github.com/hashicorp/raft-boltdb.New(0x9986cd0, 0x47, 0x0, 0x9986c00, 0x47, 0x0, 0xa05d0f8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:66 +0x34
github.com/hashicorp/raft-boltdb.NewBoltStore(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:60
github.com/hashicorp/consul/agent/consul.(*Server).setupRaft(0x9f7cfc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:626 +0x794
github.com/hashicorp/consul/agent/consul.NewServerLogger(0x7d62b40, 0xa1660f0, 0x9ea7220, 0xa166150, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:427 +0x748
github.com/hashicorp/consul/agent.(*Agent).Start(0x8e8adc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:411 +0x3b8
github.com/hashicorp/consul/agent.(*TestAgent).Start(0x9f8a460, 0x49b5400, 0x68)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/testagent.go:164 +0x6c8
github.com/hashicorp/consul/agent.NewTestAgent(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/testagent.go:101
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_serviceWrite(0x49b5400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6252 +0xc0
testing.tRunner(0x49b5400, 0x1d9cb20)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 167 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x17d7840, 0x0)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/sdk/testutil/retry.(*Timer).NextOr(0x9ed4ff0, 0x9f4e580, 0xa084c40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/sdk/testutil/retry/retry.go:205 +0x1a8
github.com/hashicorp/consul/sdk/testutil/retry.run(0x20530f0, 0x9ed4ff0, 0x205df50, 0x49b54a0, 0x9f4e560)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/sdk/testutil/retry/retry.go:125 +0xa0
github.com/hashicorp/consul/sdk/testutil/retry.Run(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/sdk/testutil/retry/retry.go:90
github.com/hashicorp/consul/agent.(*TestAgent).Start(0x9263f90, 0x49b54a0, 0x30fd2a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/testagent.go:196 +0xb30
github.com/hashicorp/consul/agent.NewTestAgent(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/testagent.go:101
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_defaultDeny(0x49b54a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6290 +0xc0
testing.tRunner(0x49b54a0, 0x1d9cb08)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 168 [chan receive]:
github.com/hashicorp/serf/serf.(*Snapshotter).Wait(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:190
github.com/hashicorp/serf/serf.(*Serf).Shutdown(0x99b6240, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:863 +0xf4
github.com/hashicorp/consul/agent/consul.(*Server).Shutdown(0x8d85680, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:816 +0x210
github.com/hashicorp/consul/agent.(*Agent).ShutdownAgent(0x8e8a000, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1687 +0x578
github.com/hashicorp/consul/agent.(*TestAgent).Shutdown(0x94b0230, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/testagent.go:247 +0x60
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_defaultAllow(0x49b5540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6340 +0x49c
testing.tRunner(0x49b5540, 0x1d9cb04)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 169 [chan send]:
testing.tRunner.func1(0x49b55e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b55e0, 0x1d9cc0c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 170 [chan send]:
testing.tRunner.func1(0x49b5680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5680, 0x1d9cc08)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 955 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x2a05f200, 0x1)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).Leave(0x7814240, 0x4beb9a0, 0x88ce500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:902 +0x1e0
github.com/hashicorp/consul/agent.(*Agent).Leave(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1612
github.com/hashicorp/consul/agent.TestAgent_StartStop(0x4beb9a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent_test.go:121 +0xe4
testing.tRunner(0x4beb9a0, 0x1d9cd88)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1133 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x51aa580, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4bf4480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4bf4480, 0x49d53c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x51d2740, 0x4bf4480, 0x49d53c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1661 [chan send, 2 minutes]:
testing.tRunner.func1(0x4e0e320)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e320, 0x1d9cf68)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 962 [chan send]:
testing.tRunner.func1(0x4bebe00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebe00, 0x1d9cbb0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 958 [chan send]:
testing.tRunner.func1(0x4bebb80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebb80, 0x1d9ccb4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1000 [chan send]:
testing.tRunner.func1(0x48cec80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cec80, 0x1d9cdc4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 999 [chan send]:
testing.tRunner.func1(0x48cebe0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cebe0, 0x1d9cdcc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 956 [chan send]:
testing.tRunner.func1(0x4beba40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4beba40, 0x1d9ccac)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 965 [chan send]:
testing.tRunner.func1(0x49b43c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b43c0, 0x1d9cd38)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1201 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936be0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936be0, 0x1d9cdf0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1699 [chan send, 2 minutes]:
testing.tRunner.func1(0x49360a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49360a0, 0x1d9cfc8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7499 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x5c4c900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7671 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531ee38, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c353c4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5c353b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5c353b0, 0x4ba01e0, 0xb6d996d0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x50e8b50, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x50e8b50, 0x4d9c700, 0x5c353b0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x491f880, 0x206b9a8, 0x50e8b50, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x491f880, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5a1d9b0, 0x1cac2e9, 0x3, 0x4ec7620, 0xf, 0x50e8b40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4dd4c60, 0x5a1d9b0, 0x496b800, 0x496b840, 0x205db30, 0x4a780e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 985 [chan send]:
testing.tRunner.func1(0x48ce320)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce320, 0x1d9cc90)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7721 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccc840, 0x1dcd6500, 0x0, 0x514ac00, 0x5458980, 0x5b48420)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 6858 [chan send, 4 minutes]:
testing.tRunner.func1(0x5989040)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5989040, 0x1d9d180)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 981 [chan send]:
testing.tRunner.func1(0x49b5ea0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5ea0, 0x1d9cd2c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7053 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x4b54cd0, 0x1cd453e, 0x17, 0x2053030, 0x5c5a720, 0xa, 0x0, 0x1a03340, 0x50f8ea0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x4b54cd0, 0x20732c8, 0x5532320, 0x1cd453e, 0x17, 0x2053030, 0x5c5a720, 0x5cff440, 0x13, 0x5a2b1c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 966 [chan send]:
testing.tRunner.func1(0x49b4aa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b4aa0, 0x1d9cd34)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1233 [chan send, 1 minutes]:
testing.tRunner.func1(0x4e0e000)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e000, 0x1d9ce2c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 984 [chan send]:
testing.tRunner.func1(0x48ce280)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce280, 0x1d9ce08)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7550 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x5cee790)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 1694 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cfcc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cfcc0, 0x1d9cfb0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1199 [chan send, 4 minutes]:
testing.tRunner.func1(0x4936aa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936aa0, 0x1d9cde8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1213 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937360, 0x1d9ce54)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 954 [select, 6 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x513fea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 1666 [chan send, 2 minutes]:
testing.tRunner.func1(0x4beb7c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4beb7c0, 0x1d9cf88)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1696 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cfe00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cfe00, 0x1d9cf6c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1343 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4ea6940, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x499cfc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x499cfc0, 0x4836d80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4ec9400, 0x499cfc0, 0x4836d80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1518 [chan send, 1 minutes]:
testing.tRunner.func1(0x4beb0e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4beb0e0, 0x1d9cefc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1210 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937180)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937180, 0x1d9cd20)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1146 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50a00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50a00, 0x1d9ce88)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 988 [chan send]:
testing.tRunner.func1(0x48ce500)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce500, 0x1d9cca4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 989 [chan send]:
testing.tRunner.func1(0x48ce5a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce5a0, 0x1d9cc8c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1249 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51860)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51860, 0x1d9cf2c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7681 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x4ea2900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7676 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x4986c00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x4986c00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x4986c00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4986c00, 0x5b480f8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 1207 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936fa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936fa0, 0x1d9cdd8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7517 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x5c4cea0, 0x1cae9dd, 0x6, 0x5c0d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 1206 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936f00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936f00, 0x1d9cd84)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7474 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5273540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionsForNode(0x5273540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:554 +0x1c
testing.tRunner(0x5273540, 0x1d9d2f4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1234 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50f00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50f00, 0x1d9cea4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1006 [chan send]:
testing.tRunner.func1(0x48cf040)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf040, 0x1d9cd60)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 807 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4bc34e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x523a240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523a240, 0x4fb1500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4910f20, 0x523a240, 0x4fb1500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 634 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4ea7dc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x523a000, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523a000, 0x48532c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x516a750, 0x523a000, 0x48532c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1690 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cfa40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cfa40, 0x1d9cf50)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1705 [chan send, 2 minutes]:
testing.tRunner.func1(0x4936500)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936500, 0x1d9cfc0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1704 [chan send, 2 minutes]:
testing.tRunner.func1(0x49363c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49363c0, 0x1d9cff8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 996 [chan send]:
testing.tRunner.func1(0x48cea00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cea00, 0x1d9cdfc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1856 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f4a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f4a0, 0x1d9d024)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6854 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988dc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988dc0, 0x1d9cc44)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1197 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936960)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936960, 0x1d9cde0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 974 [chan send]:
testing.tRunner.func1(0x49b5a40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5a40, 0x1d9cb7c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7116 [chan receive, 5 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4e44120, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5ac4000, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5ac4000, 0x54eab00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x54183b0, 0x5ac4000, 0x54eab00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7661 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x496b740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 1888 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0fea0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0fea0, 0x1d9cef0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1877 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f7c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f7c0, 0x1d9cee0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7619 [select]:
github.com/hashicorp/yamux.(*Session).send(0x594b7a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7678 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x4986c00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4986c00, 0x5b48108)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 1674 [chan send, 1 minutes]:
testing.tRunner.func1(0x48ce1e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce1e0, 0x1d9cf94)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7482 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x5c91270)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 1519 [chan send, 1 minutes]:
testing.tRunner.func1(0x4beb5e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4beb5e0, 0x1d9cef8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1204 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936dc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936dc0, 0x1d9cd30)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1691 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cfae0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cfae0, 0x1d9cf54)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6859 [chan send, 4 minutes]:
testing.tRunner.func1(0x59890e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59890e0, 0x1d9d184)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6666 [chan send, 4 minutes]:
testing.tRunner.func1(0x4b51900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51900, 0x1d9d1f8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6898 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites_ACLAgentToken(0x598e960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:189 +0x1c
testing.tRunner(0x598e960, 0x1d9d294)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7626 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x5bfa370)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 1875 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f680, 0x1d9d01c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 961 [chan send]:
testing.tRunner.func1(0x4bebd60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebd60, 0x1d9cddc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6673 [chan send, 4 minutes]:
testing.tRunner.func1(0x59880a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59880a0, 0x1d9d1f4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7675 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5d54180, 0x0, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4cdad80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7720 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4ccc840, 0x5458980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 1851 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f180)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f180, 0x1d9cecc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1211 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937220)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937220, 0x1d9cdac)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1205 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936e60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936e60, 0x1d9ccb0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7541 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x5163600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5163600, 0x4970e38)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 986 [chan send]:
testing.tRunner.func1(0x48ce3c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce3c0, 0x1d9cde4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 978 [chan send]:
testing.tRunner.func1(0x49b5cc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5cc0, 0x1d9cb68)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1001 [chan send]:
testing.tRunner.func1(0x48ced20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ced20, 0x1d9cdc8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 971 [chan send]:
testing.tRunner.func1(0x49b5860)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5860, 0x1d9cb88)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7653 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4a16c60, 0x53edd00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 1198 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936a00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936a00, 0x1d9cdb0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1152 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50dc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50dc0, 0x1d9ce94)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1247 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51720, 0x1d9cebc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1003 [chan send]:
testing.tRunner.func1(0x48cee60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cee60, 0x1d9cdc0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1698 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cff40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cff40, 0x1d9cfbc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7511 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x48db810)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 969 [chan send]:
testing.tRunner.func1(0x49b5720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5720, 0x1d9cb90)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6876 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989b80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_AutopilotGetConfiguration(0x5989b80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:293 +0x20
testing.tRunner(0x5989b80, 0x1d9d19c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1855 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f400)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f400, 0x1d9d020)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1002 [chan send]:
testing.tRunner.func1(0x48cedc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cedc0, 0x1d9ce04)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1217 [chan send, 1 minutes]:
testing.tRunner.func1(0x49375e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49375e0, 0x1d9ce48)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1192 [chan send]:
testing.tRunner.func1(0x4936640)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936640, 0x1d9cbac)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1305 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x514fc80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x523a6c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523a6c0, 0x50f6340)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4e38080, 0x523a6c0, 0x50f6340)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1005 [chan send]:
testing.tRunner.func1(0x48cefa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cefa0, 0x1d9ce00)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7668 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x4dd4c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 1240 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b512c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b512c0, 0x1d9d280)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1239 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51220)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51220, 0x1d9ceb0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1242 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51400)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51400, 0x1d9cf4c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1669 [chan send, 1 minutes]:
testing.tRunner.func1(0x4982640)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4982640, 0x1d9d008)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1878 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f860)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f860, 0x1d9cee8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1144 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b508c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b508c0, 0x1d9ce28)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 963 [chan send]:
testing.tRunner.func1(0x4bebea0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebea0, 0x1d9cba4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1216 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937540)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937540, 0x1d9ce50)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1215 [chan send, 1 minutes]:
testing.tRunner.func1(0x49374a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49374a0, 0x1d9ce20)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1209 [chan send, 1 minutes]:
testing.tRunner.func1(0x49370e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49370e0, 0x1d9cd18)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7492 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e808, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac35a4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5ac3590, 0x50b0000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5ac3590, 0x50b0000, 0x10000, 0x10000, 0x0, 0x32bc201, 0x1, 0x0, 0x1)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5d32d58, 0x50b0000, 0x10000, 0x10000, 0x4ed9734, 0x101, 0x4ed9708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5d32d58, 0x50b0000, 0x10000, 0x10000, 0x2, 0x1, 0x0, 0x2054770, 0x50dd560)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x510c740, 0x5d32d58)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7672 [IO wait]:
internal/poll.runtime_pollWait(0xa531e910, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c35414, 0x72, 0xff00, 0xffff, 0x8d99230)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x5c35400, 0xa100000, 0xffff, 0xffff, 0x8d99230, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x5c35400, 0xa100000, 0xffff, 0xffff, 0x8d99230, 0x28, 0x28, 0xa495bb0c, 0x1d3a8, 0xa495bb0c, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x4d9c730, 0xa100000, 0xffff, 0xffff, 0x8d99230, 0x28, 0x28, 0xb6d9936c, 0x4d08000, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x4d9c730, 0xa100000, 0xffff, 0xffff, 0x8d99230, 0x28, 0x28, 0xb6d9936c, 0x4d08000, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x4d9c730, 0xa100000, 0xffff, 0xffff, 0x62, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x491f900, 0x4d9c730, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x4d9c758, 0x4d9c730, 0x77359400, 0x0, 0xa035a40, 0x1, 0x0, 0x0, 0x2054770, 0xa035a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x491f900, 0x4d9c730, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x491f900, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5a1d9e0, 0x1cac30d, 0x3, 0x4ec7670, 0xf, 0x50e8b60, 0x5cfd6d0, 0x1)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4dd4c60, 0x5a1d9e0, 0x496b800, 0x496b840, 0x205db48, 0x4a78120)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 1148 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50b40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50b40, 0x1d9ce80)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1237 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b510e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b510e0, 0x1d9ceac)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1244 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51540)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51540, 0x1d9ced8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6737 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd540)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd540, 0x1d9d2fc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1845 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0edc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0edc0, 0x1d9cec8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7724 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x53aa000, 0x1cae9dd, 0x6, 0x4ea7280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 1194 [chan send]:
testing.tRunner.func1(0x4936780)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936780, 0x1d9cc6c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7504 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5c0d7c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 1218 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937680, 0x1d9ce44)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1668 [chan send, 1 minutes]:
testing.tRunner.func1(0x4beb900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4beb900, 0x1d9d00c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7651 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4a16c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 975 [chan send]:
testing.tRunner.func1(0x49b5ae0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5ae0, 0x1d9cb80)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6750 [chan send, 4 minutes]:
testing.tRunner.func1(0x5cbdd60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbdd60, 0x1d9d1f0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1230 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937e00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937e00, 0x1d9ce1c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6747 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbdb80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbdb80, 0x1d9d0b8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6857 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988fa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988fa0, 0x1d9cb64)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1700 [chan send, 2 minutes]:
testing.tRunner.func1(0x4936140)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936140, 0x1d9cfcc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7725 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x53aa000, 0x1cad4b5, 0x5, 0x4ea7300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7436 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x4ff36c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7177 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0ed20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestFixupLockDelay(0x4e0ed20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:227 +0x1c
testing.tRunner(0x4e0ed20, 0x1d9d06c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1849 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f040)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f040, 0x1d9ced4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1245 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b515e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b515e0, 0x1d9cf30)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1246 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51680, 0x1d9cf1c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1238 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51180)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51180, 0x1d9ceb4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 991 [chan send]:
testing.tRunner.func1(0x48ce6e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce6e0, 0x1d9cc9c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1153 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50e60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50e60, 0x1d9ce8c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7663 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x5c46000, 0x0, 0x1d9d4d8, 0x4fa17a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 1004 [chan send]:
testing.tRunner.func1(0x48cef00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cef00, 0x1d9cdbc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7178 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0ee60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionDestroy(0x4e0ee60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:309 +0x1c
testing.tRunner(0x4e0ee60, 0x1d9d2cc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 972 [chan send]:
testing.tRunner.func1(0x49b5900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5900, 0x1d9cb8c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1149 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50be0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50be0, 0x1d9ce78)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7440 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e784, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4bfc3d4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4bfc3c0, 0x55fc000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4bfc3c0, 0x55fc000, 0x10000, 0x10000, 0x54800, 0x5240b01, 0x1, 0x0, 0x15c90e0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5910760, 0x55fc000, 0x10000, 0x10000, 0x4dad734, 0x101, 0x4dad708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5910760, 0x55fc000, 0x10000, 0x10000, 0x54d4c40, 0x54d4c40, 0x54d4c40, 0x32bc2a8, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x496b400, 0x5910760)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7666 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x5c46000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 990 [chan send]:
testing.tRunner.func1(0x48ce640)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce640, 0x1d9cca0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7717 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4ccc840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7716 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4ccc840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 980 [chan send]:
testing.tRunner.func1(0x49b5e00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5e00, 0x1d9cb6c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 998 [chan send]:
testing.tRunner.func1(0x48ceb40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ceb40, 0x1d9cdd0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1200 [chan send, 3 minutes]:
testing.tRunner.func1(0x4936b40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936b40, 0x1d9cbec)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 982 [chan send]:
testing.tRunner.func1(0x49b5f40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5f40, 0x1d9cbfc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6774 [chan send, 4 minutes]:
testing.tRunner.func1(0x5cb7900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cb7900, 0x1d9d200)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 997 [chan send]:
testing.tRunner.func1(0x48ceaa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ceaa0, 0x1d9cdd4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1195 [chan send]:
testing.tRunner.func1(0x4936820)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936820, 0x1d9cda8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7719 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccc840, 0x2a05f200, 0x1, 0x514abc0, 0x5458980, 0x5b48410)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 1726 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4a1b8c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4e36900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4e36900, 0x5357d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5118370, 0x4e36900, 0x5357d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7667 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x5a1d890, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x4dd4c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 959 [chan send]:
testing.tRunner.func1(0x4bebc20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebc20, 0x1d9ccb8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6665 [chan send, 4 minutes]:
testing.tRunner.func1(0x4b517c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b517c0, 0x1d9d1fc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7495 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x48db600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7540 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x5163600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x5163600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x5163600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5163600, 0x4970e28)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 1229 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937d60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937d60, 0x1d9ce58)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 994 [chan send]:
testing.tRunner.func1(0x48ce8c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce8c0, 0x1d9cc94)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7513 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x48db810, 0x4f59ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 973 [chan send]:
testing.tRunner.func1(0x49b59a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b59a0, 0x1d9cb78)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7502 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x5c4c900, 0x1cad4b5, 0x5, 0x5ca18e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 979 [chan send]:
testing.tRunner.func1(0x49b5d60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5d60, 0x1d9cb70)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 960 [chan send]:
testing.tRunner.func1(0x4bebcc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebcc0, 0x1d9cdf4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1509 [semacquire, 6 minutes]:
sync.runtime_Semacquire(0x5150dec)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x5150dec)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x5150d90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x48cab40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:362 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x48cab40, 0x51f6fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:180 +0x694
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4ec6110, 0x48cab40, 0x51f6fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1900 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272640)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272640, 0x1d9d0d0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7556 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4d7b9e0, 0x1cae9dd, 0x6, 0x53a48a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7658 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54de000, 0x1cad4b5, 0x5, 0x4cd9260)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 6838 [chan send, 4 minutes]:
testing.tRunner.func1(0x59883c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59883c0, 0x1d9d14c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1664 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0e500)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e500, 0x1d9cfd8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1857 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f540)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f540, 0x1d9d02c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1670 [chan send, 1 minutes]:
testing.tRunner.func1(0x4982780)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4982780, 0x1d9cf90)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1847 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0ef00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0ef00, 0x1d9cf48)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1573 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x51202c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x5150d90, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5150d90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 1692 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cfb80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cfb80, 0x1d9cfd0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1889 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0ff40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0ff40, 0x1d9cf80)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7438 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5bf6630)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7503 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x5c4c900, 0x1cad6ef, 0x5, 0x5ca1900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 1231 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937ea0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937ea0, 0x1d9ce18)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6843 [chan send, 4 minutes]:
testing.tRunner.func1(0x59886e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59886e0, 0x1d9d12c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7491 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e88c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3554, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac3540, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac3540, 0x3, 0x3, 0x3cc)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x58f58c0, 0xb6d996d0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x58f58c0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x510c740, 0x58f58c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 1196 [chan send, 1 minutes]:
testing.tRunner.func1(0x49368c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49368c0, 0x1d9cdb4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6744 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd9a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd9a0, 0x1d9d0a0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1675 [chan send, 2 minutes]:
testing.tRunner.func1(0x48ce820)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce820, 0x1d9d000)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6885 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Explain(0x598e140)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:674 +0x1c
testing.tRunner(0x598e140, 0x1d9d24c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 987 [chan send]:
testing.tRunner.func1(0x48ce460)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce460, 0x1d9cca8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6748 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbdc20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbdc20, 0x1d9d27c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1223 [chan send, 1 minutes]:
testing.tRunner.func1(0x49379a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49379a0, 0x1d9ce70)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 633 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5242dc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4e378c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4e378c0, 0x48530c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4ad80e0, 0x4e378c0, 0x48530c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1208 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937040)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937040, 0x1d9cd1c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 967 [chan send]:
testing.tRunner.func1(0x49b4b40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b4b40, 0x1d9cc18)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 612 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4de5a80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4bf5d40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4bf5d40, 0x5274300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4910150, 0x4bf5d40, 0x5274300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1222 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937900, 0x1d9ce74)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1695 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cfd60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cfd60, 0x1d9cf70)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1212 [chan send, 1 minutes]:
testing.tRunner.func1(0x49372c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49372c0, 0x1d9ce14)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1202 [chan send, 1 minutes]:
testing.tRunner.func1(0x4936c80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936c80, 0x1d9cdec)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 419 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4de44c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4e36480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4e36480, 0x4d90f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4ec6260, 0x4e36480, 0x4d90f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 6875 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989ae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_Keyring_InvalidRelayFactor(0x5989ae0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:267 +0x20
testing.tRunner(0x5989ae0, 0x1d9d1b4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 964 [chan send]:
testing.tRunner.func1(0x4bebf40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebf40, 0x1d9cba8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1890 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272000)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272000, 0x1d9d048)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1688 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf900, 0x1d9cedc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7659 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54de000, 0x1cad6ef, 0x5, 0x4cd9280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 1702 [chan send, 2 minutes]:
testing.tRunner.func1(0x4936280)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4936280, 0x1d9ce10)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1685 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf720, 0x1d9cfe8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7674 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4a7a900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 7601 [select, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*state).run(0x5096540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:220 +0x1c0
created by github.com/hashicorp/consul/agent/proxycfg.(*state).Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:106 +0xbc

goroutine 1682 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf4a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf4a0, 0x1d9cf14)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6738 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd5e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd5e0, 0x1d9d300)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6894 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e6e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec_ACLAgentToken(0x598e6e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:117 +0x1c
testing.tRunner(0x598e6e0, 0x1d9d284)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7650 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4a16c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 1686 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf7c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf7c0, 0x1d9cfe0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7452 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x506ff40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x506ff40, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x506ff40, 0x1cb2cd8, 0x8, 0x52e3fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x506ff40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 1151 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50d20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50d20, 0x1d9ce90)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1521 [chan send, 1 minutes]:
testing.tRunner.func1(0x4beb720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4beb720, 0x1d9cf84)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6861 [chan send, 4 minutes]:
testing.tRunner.func1(0x5989220)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5989220, 0x1d9d168)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7174 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0eb40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate_Delete(0x4e0eb40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:98 +0x1c
testing.tRunner(0x4e0eb40, 0x1d9d2b8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1520 [chan send, 1 minutes]:
testing.tRunner.func1(0x4beb680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4beb680, 0x1d9cf8c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7509 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x48db810)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 1701 [chan send, 2 minutes]:
testing.tRunner.func1(0x49361e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49361e0, 0x1d9cfd4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 968 [chan send]:
testing.tRunner.func1(0x49b4e60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b4e60, 0x1d9cb98)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7723 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x53aa000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 1876 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f720, 0x1d9ceec)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7600 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x5bfa370, 0x1cc0a02, 0xf, 0x2052fe8, 0x4fb4060, 0x1, 0x0, 0x1828bc8, 0x5bd1650, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5bfa370, 0x20732c8, 0x5bdf2c0, 0x1cc0a02, 0xf, 0x2052fe8, 0x4fb4060, 0x1cb690f, 0xa, 0x52dce80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 1214 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937400)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937400, 0x1d9ce24)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1677 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf180)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf180, 0x1d9cef4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6860 [chan send, 4 minutes]:
testing.tRunner.func1(0x5989180)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5989180, 0x1d9d16c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1879 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f900, 0x1d9cee4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1850 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f0e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f0e0, 0x1d9ced0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1147 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50aa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50aa0, 0x1d9ce84)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6735 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd400)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd400, 0x1d9d0ac)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1193 [chan send]:
testing.tRunner.func1(0x49366e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49366e0, 0x1d9cb94)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 977 [chan send]:
testing.tRunner.func1(0x49b5c20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5c20, 0x1d9cb74)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1226 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937b80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937b80, 0x1d9ce60)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1228 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937cc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
runtime.Goexit()
	/usr/lib/go-1.13/src/runtime/panic.go:563 +0x104
testing.(*common).FailNow(0x4937cc0)
	/usr/lib/go-1.13/src/testing/testing.go:653 +0x2c
testing.(*common).Fatalf(0x4937cc0, 0x1cb0bbe, 0x7, 0x8a8df68, 0x1, 0x1)
	/usr/lib/go-1.13/src/testing/testing.go:716 +0x6c
github.com/hashicorp/consul/agent.TestCatalogServiceNodes_DistanceSort(0x4937cc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:922 +0xb44
testing.tRunner(0x4937cc0, 0x1d9ce5c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6835 [chan send, 4 minutes]:
testing.tRunner.func1(0x59881e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59881e0, 0x1d9d13c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1225 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937ae0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937ae0, 0x1d9ce64)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1236 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b51040)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51040, 0x1d9cea8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7437 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5bf6630)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7715 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531ef40, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac38c4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5ac38b0, 0x549e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5ac38b0, 0x549e000, 0x10000, 0x10000, 0x5a02e00, 0x12b01, 0x1b801, 0x0, 0x41df0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5b48178, 0x549e000, 0x10000, 0x10000, 0x5a02f34, 0x101, 0x5a02f08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5b48178, 0x549e000, 0x10000, 0x10000, 0x52dab20, 0x54d5b80, 0xe289a0, 0x1d90000, 0x5a02f50)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x514a980, 0x5b48178)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 6670 [chan send, 4 minutes]:
testing.tRunner.func1(0x4b51ea0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51ea0, 0x1d9d1e4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6667 [chan send, 4 minutes]:
testing.tRunner.func1(0x4b519a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b519a0, 0x1d9d1e8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1224 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937a40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937a40, 0x1d9ce6c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1874 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f5e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f5e0, 0x1d9d028)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7057 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5a2b240, 0x5d318b0, 0x20732c8, 0x5532fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 7548 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x5cee790)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 1221 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937860)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
runtime.Goexit()
	/usr/lib/go-1.13/src/runtime/panic.go:563 +0x104
testing.(*common).FailNow(0x4937860)
	/usr/lib/go-1.13/src/testing/testing.go:653 +0x2c
testing.(*common).Fatalf(0x4937860, 0x1cb0bbe, 0x7, 0x6d31f68, 0x1, 0x1)
	/usr/lib/go-1.13/src/testing/testing.go:716 +0x6c
github.com/hashicorp/consul/agent.TestCatalogNodes_DistanceSort(0x4937860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:437 +0xa5c
testing.tRunner(0x4937860, 0x1d9ce40)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 995 [chan send]:
testing.tRunner.func1(0x48ce960)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce960, 0x1d9cdb8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1220 [chan send, 1 minutes]:
testing.tRunner.func1(0x49377c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49377c0, 0x1d9ce3c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1697 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cfea0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cfea0, 0x1d9cf74)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 992 [chan send]:
testing.tRunner.func1(0x48ce780)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48ce780, 0x1d9cc88)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 957 [chan send]:
testing.tRunner.func1(0x4bebae0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4bebae0, 0x1d9cd90)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 970 [chan send]:
testing.tRunner.func1(0x49b57c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b57c0, 0x1d9cb84)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7660 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x5c46000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 7726 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x53aa000, 0x1cad6ef, 0x5, 0x4ea76c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7506 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5cc2a20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 1232 [chan send, 1 minutes]:
testing.tRunner.func1(0x4937f40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4937f40, 0x1d9ce34)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1150 [chan send, 1 minutes]:
testing.tRunner.func1(0x4b50c80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b50c80, 0x1d9ce7c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 611 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x496c540, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4a7afc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a7afc0, 0x5274200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4bd5520, 0x4a7afc0, 0x5274200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 6777 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cb7ae0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cb7ae0, 0x1d9d1d0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7498 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x48db600, 0x1dcd6500, 0x0, 0x510c980, 0x4f59540, 0x5d32ff0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 6836 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988280)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988280, 0x1d9d140)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7505 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5cc2a20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 1689 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf9a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf9a0, 0x1d9cff4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6839 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988460)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988460, 0x1d9d148)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7090 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5a41700, 0x4e139f4, 0x20, 0x20, 0x1, 0x535f990)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5a41620, 0x20732c8, 0x5a41700, 0x5d0a9e0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5a41620, 0x5d45d80, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4a7a6c0, 0x5d584f8, 0x51d8f0c, 0x4e13b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x549d128, 0x5d584e0, 0x51d8f00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x549d178, 0x13, 0x1cac698, 0x4, 0x4e13d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x549d178, 0x13, 0x50c754c, 0x3, 0x3, 0x69701, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5bbd920, 0x59797d0, 0x548a418, 0x0, 0x547a2d0, 0x5a41160, 0x1b47298, 0x5d584e0, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x59797d0, 0x2073568, 0x5a415c0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4a7a6c0, 0x1cbfad5, 0xf, 0x1b47298, 0x4efd8c0, 0x1828bc8, 0x51d8ed0, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4ef29a0, 0x1cbfad5, 0xf, 0x1b47298, 0x4efd8c0, 0x1828bc8, 0x51d8ed0, 0x51d8ea4, 0x32b9d50)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x5c929d8, 0x1, 0x0, 0xb2c97000, 0x8b, 0x5a415a0, 0x2052fe8, 0x4efd8c0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x5c929d8, 0x5cd5820, 0x1828bc8, 0x4a8d2f0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 6881 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989ea0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Create(0x5989ea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:77 +0x20
testing.tRunner(0x5989ea0, 0x1d9d204)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1461 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4fb59e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x4bf4b40, 0x4cc4d0d, 0x1bcef80, 0x4d4d700, 0x0, 0x0, 0x7d278, 0x48a3ef4)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:370 +0x120
github.com/hashicorp/consul/agent/consul.(*Server).initializeCAConfig(0x4bf4b40, 0x0, 0x0, 0x4cc4fbc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:996 +0xac
github.com/hashicorp/consul/agent/consul.(*Server).initializeCA(0x4bf4b40, 0x51fab80, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader_oss.go:13 +0x2c
github.com/hashicorp/consul/agent/consul.(*Server).establishLeadership(0x4bf4b40, 0x2, 0x2)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:324 +0x134
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4bf4b40, 0x4eb4880)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:176 +0x624
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x484a6d0, 0x4bf4b40, 0x4eb4880)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1462 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4994ac0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*AutopilotDelegate).PromoteNonVoters(0x51968c8, 0x4b96f08, 0x0, 0x0, 0x0, 0x0, 0x0, 0x51fa700, 0x4b6f7b4, 0x4995600, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot.go:69 +0x40
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).promoteServers(0x48a3ea0, 0x4b6f744, 0x3)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:140 +0x120
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x48a3ea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:112 +0x198
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 6863 [chan send, 4 minutes]:
testing.tRunner.func1(0x5989360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5989360, 0x1d9d164)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6749 [chan send, 4 minutes]:
testing.tRunner.func1(0x5cbdcc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbdcc0, 0x1d9d278)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6837 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988320)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988320, 0x1d9d144)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6671 [chan send, 4 minutes]:
testing.tRunner.func1(0x4b51f40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4b51f40, 0x1d9ca90)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1463 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x497e880, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x48a3ea0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x48a3ea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 6889 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e3c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_parseLimit(0x598e3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:961 +0x1c
testing.tRunner(0x598e3c0, 0x1d9d274)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7486 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x5a46200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46200, 0x5d32ce0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6896 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites(0x598e820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:173 +0x1c
testing.tRunner(0x598e820, 0x1d9d2a0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7545 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5104870)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 6849 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988aa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988aa0, 0x1d9d154)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1853 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f2c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f2c0, 0x1d9cf10)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7484 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5b03590, 0x5a75d00, 0x5964540, 0x5a75d00, 0x5a75c80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x50cc380)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7441 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4a16c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7471 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x506ff90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7181 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0fae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionGet(0x4e0fae0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:460 +0x1c
testing.tRunner(0x4e0fae0, 0x1d9d2d8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7173 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0eaa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate(0x4e0eaa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:41 +0x1c
testing.tRunner(0x4e0eaa0, 0x1d9d2c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6844 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988780)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988780, 0x1d9d134)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7662 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x507c180, 0x51e2bc0, 0x1cac33a, 0x3, 0x5459440, 0x51e2b40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 6900 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598eaa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHandleRemoteExec(0x598eaa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:330 +0x1c
testing.tRunner(0x598eaa0, 0x1d9d0c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1657 [chan send, 2 minutes]:
testing.tRunner.func1(0x4e0e0a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e0a0, 0x1d9cfa8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1897 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272460)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272460, 0x1d9d344)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1663 [chan send, 2 minutes]:
testing.tRunner.func1(0x4e0e460)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e460, 0x1d9cfdc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1671 [chan send, 1 minutes]:
testing.tRunner.func1(0x4983860)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4983860, 0x1d9d004)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6855 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988e60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988e60, 0x1d9cc24)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1684 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf680, 0x1d9cfe4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1896 [chan send, 3 minutes]:
testing.tRunner.func1(0x52723c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x52723c0, 0x1d9d058)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1892 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272140)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272140, 0x1d9d060)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1891 [chan send, 3 minutes]:
testing.tRunner.func1(0x52720a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x52720a0, 0x1d9d044)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1894 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272280)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272280, 0x1d9d04c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1899 [chan send, 3 minutes]:
testing.tRunner.func1(0x52725a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x52725a0, 0x1d9d0d4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1895 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272320)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272320, 0x1d9d054)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1893 [chan send, 3 minutes]:
testing.tRunner.func1(0x52721e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x52721e0, 0x1d9d05c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6741 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd7c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd7c0, 0x1d9d08c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1659 [chan send, 2 minutes]:
testing.tRunner.func1(0x4e0e1e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e1e0, 0x1d9cf34)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1660 [chan send, 2 minutes]:
testing.tRunner.func1(0x4e0e280)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e280, 0x1d9cff0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1673 [chan send, 1 minutes]:
testing.tRunner.func1(0x49b5b80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x49b5b80, 0x1d9cf9c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7062 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5d45d80, 0x5d0a9e0, 0x20732c8, 0x5a41700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 1683 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf540)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf540, 0x1d9cf08)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7473 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x51adb60, 0x20001, 0x10000, 0x4, 0x5b80000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x491e700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 1665 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0e5a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0e5a0, 0x1d9cfb8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6856 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988f00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988f00, 0x1d9cc1c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1678 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf220)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf220, 0x1d9cf00)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8225 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x59cc960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestStatusPeers(0x59cc960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/status_endpoint_test.go:29 +0x1c
testing.tRunner(0x59cc960, 0x1d9d31c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7496 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x48db600, 0x2a05f200, 0x1, 0x510c940, 0x4f59540, 0x5d32fe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 1680 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf360, 0x1d9cf04)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1679 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf2c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf2c0, 0x1d9cf0c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1852 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f220)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f220, 0x1d9cf60)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1569 [chan receive, 6 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x52dba00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x523a480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523a480, 0x51fe080)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x53a7480, 0x523a480, 0x51fe080)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 1676 [chan send, 2 minutes]:
testing.tRunner.func1(0x48cf0e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x48cf0e0, 0x1d9cfa0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7718 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4ccc840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7618 [IO wait]:
internal/poll.runtime_pollWait(0xa531ea9c, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599c884, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x599c870, 0x5471000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x599c870, 0x5471000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5b90768, 0x5471000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5d49ad0, 0x4ec6cd0, 0xc, 0xc, 0x594b880, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5d49ad0, 0x4ec6cd0, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x594b7a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x594b7a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 1854 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0f360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0f360, 0x1d9cf58)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6899 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598ea00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites_ACLDeny(0x598ea00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:200 +0x1c
testing.tRunner(0x598ea00, 0x1d9d298)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6847 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988960)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988960, 0x1d9d15c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6746 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbdae0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbdae0, 0x1d9ce9c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7657 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54de000, 0x1cae9dd, 0x6, 0x4cd9240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 6740 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd720, 0x1d9d308)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7472 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4e63cc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7060 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5a41580, 0x4cca974, 0x20, 0x20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5a413e0, 0x20732c8, 0x5a41580, 0x5d0a968, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5a413e0, 0x5d45d40, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4a7a6c0, 0x5cbc198, 0x51d8dec, 0x4ccab50, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Health).ServiceNodes(0x549d0c8, 0x5cbc140, 0x51d8de0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/health_endpoint.go:185 +0x188
reflect.Value.call(0x4aab7c0, 0x549d108, 0x13, 0x1cac698, 0x4, 0x4ccad4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab7c0, 0x549d108, 0x13, 0x4e6a54c, 0x3, 0x3, 0x2a201, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5bbd860, 0x59797d0, 0x548a398, 0x0, 0x547a0f0, 0x5a41380, 0x1b9a700, 0x5cbc140, 0x16, 0x1828b28, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x59797d0, 0x2073568, 0x5a41360, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4a7a6c0, 0x1cc8efe, 0x13, 0x1b9a700, 0x48cf400, 0x1828b28, 0x51d8db0, 0xe0faf4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4ef29a0, 0x1cc8efe, 0x13, 0x1b9a700, 0x48cf400, 0x1828b28, 0x51d8db0, 0x51d8d84, 0x32b9d50)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*HealthServices).Fetch(0x5c929e8, 0x4, 0x0, 0xb2c97000, 0x8b, 0x5a41340, 0x2053048, 0x48cf400, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/health_services.go:41 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7d8, 0x5c929e8, 0x5cd5860, 0x1828b28, 0x51d8c00, 0x0, 0x0, 0x0, 0x0, 0x4, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 6882 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_List(0x5989f40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:165 +0x1c
testing.tRunner(0x5989f40, 0x1d9d26c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7544 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5104870)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 1848 [chan send, 3 minutes]:
testing.tRunner.func1(0x4e0efa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4e0efa0, 0x1d9cf44)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7507 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e25c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac36e4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac36d0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac36d0, 0x4995a40, 0x0, 0x4d98824)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4ede3b0, 0x4995fc0, 0x5065fac, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4ede3b0, 0x0, 0x1, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x510cd40, 0x4ede3b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 1901 [chan send, 3 minutes]:
testing.tRunner.func1(0x52726e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x52726e0, 0x1d9d0cc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1902 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272780)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272780, 0x1d9d0ec)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1903 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272820)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272820, 0x1d9d0e8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1904 [chan send, 3 minutes]:
testing.tRunner.func1(0x52728c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x52728c0, 0x1d9d0fc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1905 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272960)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272960, 0x1d9d0f8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1906 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272a00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272a00, 0x1d9d0f4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1907 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272aa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272aa0, 0x1d9d0f0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1908 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272b40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272b40, 0x1d9d114)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1909 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272be0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272be0, 0x1d9d108)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1911 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272d20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272d20, 0x1d9d100)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1913 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272e60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272e60, 0x1d9d110)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1914 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272f00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272f00, 0x1d9d0e4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1915 [chan send, 3 minutes]:
testing.tRunner.func1(0x5272fa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5272fa0, 0x1d9d0dc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1916 [chan send, 3 minutes]:
testing.tRunner.func1(0x5273040)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5273040, 0x1d9d0e0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1917 [chan send, 3 minutes]:
testing.tRunner.func1(0x52730e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x52730e0, 0x1d9d064)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6734 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd360, 0x1d9d0b0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7494 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x48db600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7490 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5cc2360)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7669 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x4dd4c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 7560 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5104f30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 6866 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_DELETE_ConflictingFlags(0x5989540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:451 +0x20
testing.tRunner(0x5989540, 0x1d9d170)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 4488 [select, 6 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5a88a40, 0x52e5a0c, 0x20, 0x20, 0x549cd60, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5a889a0, 0x20732c8, 0x5a88a40, 0x549cd68, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5a889a0, 0x5a74780, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x499d440, 0x4a0664c, 0x5a7475c, 0x52e5b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x5249a90, 0x4a06620, 0x5a74740, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x4d7e670, 0x13, 0x1cac698, 0x4, 0x52e5d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x4d7e670, 0x13, 0x4b01d4c, 0x3, 0x3, 0xe39e7e01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x544b140, 0x51d80f0, 0x54cd658, 0x0, 0x523ca50, 0x517d620, 0x1b9a5b0, 0x4a06620, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x51d80f0, 0x2073568, 0x5a888e0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x499d440, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x4a06540, 0x1828b00, 0x5a74700, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4e70000, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x4a06540, 0x1828b00, 0x5a74700, 0x59f5144, 0x4bb2b40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x490aca0, 0xf, 0x0, 0xb2c97000, 0x8b, 0x5a888c0, 0x2052fd0, 0x4a06540, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x490aca0, 0x52bf840, 0x1828b00, 0x5a74680, 0x0, 0x0, 0x0, 0x0, 0xf, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 7714 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531eb20, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac37d4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac37c0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac37c0, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x50e8190, 0x12d74, 0x7e8ac, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x50e8190, 0x50aa240, 0x0, 0xffffffff)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x514a980, 0x50e8190)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7512 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x48db810, 0x3b9aca00, 0x0, 0x510cf40, 0x4f59ac0, 0x5d332f8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 6879 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_ServerHealth(0x5989d60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:415 +0x1c
testing.tRunner(0x5989d60, 0x1d9d1cc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6895 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec_ACLDeny(0x598e780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:128 +0x1c
testing.tRunner(0x598e780, 0x1d9d288)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6883 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Execute(0x598e000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:248 +0x1c
testing.tRunner(0x598e000, 0x1d9d23c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7439 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e994, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4bfc384, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4bfc370, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4bfc370, 0x1cc2519, 0x10, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x52a2460, 0x5c4fbc0, 0x12b58, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x52a2460, 0x0, 0x2e1fec, 0x59dd880)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x496b400, 0x52a2460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 6887 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Update(0x598e280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:835 +0x20
testing.tRunner(0x598e280, 0x1d9d270)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7501 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x5c4c900, 0x1cae9dd, 0x6, 0x5ca18c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7538 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x5a8fb20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 6880 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_ServerHealth_Unhealthy(0x5989e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:447 +0x1c
testing.tRunner(0x5989e00, 0x1d9d1c8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6888 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Delete(0x598e320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:913 +0x20
testing.tRunner(0x598e320, 0x1d9d208)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6886 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e1e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Get(0x598e1e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:769 +0x1c
testing.tRunner(0x598e1e0, 0x1d9d258)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8223 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x59cc820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSnapshot_Options(0x59cc820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/snapshot_endpoint_test.go:60 +0x1c
testing.tRunner(0x59cc820, 0x1d9d310)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7489 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5cc2360)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 6775 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cb79a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cb79a0, 0x1d9d1d8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6745 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbda40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbda40, 0x1d9d078)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7175 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0ebe0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate_DefaultCheck(0x4e0ebe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:154 +0x20
testing.tRunner(0x4e0ebe0, 0x1d9d2b4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6893 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec_ACLToken(0x598e640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:106 +0x1c
testing.tRunner(0x598e640, 0x1d9d28c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6871 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringInstall(0x5989860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:81 +0x20
testing.tRunner(0x5989860, 0x1d9d1a4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6842 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988640)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988640, 0x1d9d130)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 4489 [select, 6 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5a74780, 0x549cd68, 0x20732c8, 0x5a88a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 7056 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5532fc0, 0x49159c4, 0x20, 0x20, 0x24300, 0x4eddd50)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5532900, 0x20732c8, 0x5532fc0, 0x5d318b0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5532900, 0x5a2b240, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4a7a6c0, 0x5c5a85c, 0x50f8ff8, 0x4915b68, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConfigEntry).ResolveServiceConfig(0x549cff0, 0x5c5a840, 0x50f8ff0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/config_endpoint.go:230 +0x1dc
reflect.Value.call(0x4aab5c0, 0x549d050, 0x13, 0x1cac698, 0x4, 0x4915d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab5c0, 0x549d050, 0x13, 0x5260d4c, 0x3, 0x3, 0xfb0e4f01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5bbd640, 0x59797d0, 0x54cc858, 0x0, 0x4b55e00, 0x5ac0300, 0x1b47460, 0x5c5a840, 0x16, 0x1a03340, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x59797d0, 0x2073568, 0x55328a0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4a7a6c0, 0x1cebb70, 0x20, 0x1b47460, 0x5c5a720, 0x1a03340, 0x50f8fc0, 0xe102ac, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4ef29a0, 0x1cebb70, 0x20, 0x1b47460, 0x5c5a720, 0x1a03340, 0x50f8fc0, 0x50f8f94, 0x4bb23c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ResolvedServiceConfig).Fetch(0x5c92a00, 0xa, 0x0, 0xb2c97000, 0x8b, 0x5532880, 0x2053030, 0x5c5a720, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/resolved_service_config.go:41 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b838, 0x5c92a00, 0x5cd58c0, 0x1a03340, 0x50f8ea0, 0x0, 0x0, 0x0, 0x0, 0xa, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 7514 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x48db810, 0xbebc200, 0x0, 0x510cf80, 0x4f59ac0, 0x5d33308)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7061 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5d45d40, 0x5d0a968, 0x20732c8, 0x5a41580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 7510 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x48db810)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7628 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x51503f0, 0x1d9d490, 0x4a7a900, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4a7a900, 0x2080d50, 0x5c92028)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4a7a900, 0x2080d50, 0x5c92028, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 6897 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e8c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites_ACLToken(0x598e8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:178 +0x1c
testing.tRunner(0x598e8c0, 0x1d9d29c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7722 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x53aa000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7673 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531edb4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4bfc564, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4bfc550, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4bfc550, 0x1b5c4c0, 0x4bb23c0, 0xb6d99a00)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x52a2c60, 0xbc, 0x18, 0x496d760)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x52a2c60, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x52a2c60, 0x20, 0x1b5c4c0, 0x310001, 0x496d760)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x5bf6ab0, 0x2064b48, 0x5910ac8, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x4dd4c60, 0x53edf00, 0x5269fe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 7054 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*serviceConfigWatch).runWatch(0x5a2b200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/service_manager.go:180 +0xac
created by github.com/hashicorp/consul/agent.(*serviceConfigWatch).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/service_manager.go:158 +0x98

goroutine 7677 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x4986c00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4986c00, 0x5b48100)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6865 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x59894a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_PUT_ConflictingFlags(0x59894a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:432 +0x20
testing.tRunner(0x59894a0, 0x1d9d17c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7483 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x5a33bf0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7487 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x5a46200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46200, 0x5d32ce8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6848 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988a00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988a00, 0x1d9d160)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7543 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x53a45e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 6840 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988500)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988500, 0x1d9d150)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6845 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988820)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988820, 0x1d9d138)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7679 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4ea71c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7680 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x4ea2900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 6884 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e0a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_ExecuteCached(0x598e0a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:617 +0x20
testing.tRunner(0x598e0a0, 0x1d9d238)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7620 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x594b7a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7485 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x5a46200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x5a46200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x5a46200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46200, 0x5d32cd8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6878 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989cc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_AutopilotCASConfiguration(0x5989cc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:347 +0x20
testing.tRunner(0x5989cc0, 0x1d9d198)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7539 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5a8fb20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 6869 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_RaftConfiguration(0x5989720)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:20 +0x20
testing.tRunner(0x5989720, 0x1d9d1b8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6846 [chan send, 4 minutes]:
testing.tRunner.func1(0x59888c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59888c0, 0x1d9d158)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6864 [chan send, 4 minutes]:
testing.tRunner.func1(0x5989400)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5989400, 0x1d9d174)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6892 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598e5a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec(0x598e5a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:101 +0x1c
testing.tRunner(0x598e5a0, 0x1d9d290)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6870 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x59897c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_RaftPeer(0x59897c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:46 +0x1c
testing.tRunner(0x59897c0, 0x1d9d1c4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6872 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringList(0x5989900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:117 +0x20
testing.tRunner(0x5989900, 0x1d9d1a8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7654 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4a16c60, 0xbebc200, 0x0, 0x496b600, 0x53edd00, 0x5910a18)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 6862 [chan send, 4 minutes]:
testing.tRunner.func1(0x59892c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59892c0, 0x1d9d178)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7508 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e154, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3734, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5ac3720, 0x52ee000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5ac3720, 0x52ee000, 0x10000, 0x10000, 0x15c6b00, 0x5ac7201, 0x2073501, 0x0, 0x3f800000)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5d33050, 0x52ee000, 0x10000, 0x10000, 0x4df0734, 0x101, 0x4df0708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5d33050, 0x52ee000, 0x10000, 0x10000, 0x1, 0x5bfcd90, 0x15b0858, 0x1cc3fc8, 0x11)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x510cd40, 0x5d33050)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 6672 [chan send, 4 minutes]:
testing.tRunner.func1(0x5988000)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5988000, 0x1d9d03c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7621 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x594b8f0, 0x558e000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x5d49bc0)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x5d49bc0, 0x9ea5f00, 0x0, 0x594b8f4)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x4e42cc0, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x58b3130)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x5d49bf0, 0x1845a70, 0x5c56300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x5d49bf0, 0x1845a70, 0x5c56300, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x5d49b90, 0x1845a70, 0x5c56300, 0x5c56300, 0x5b03548)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x5d49b90, 0x5c56300, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x5b03530, 0x2074928, 0x5d49b90, 0x4b14f90, 0x48a0400, 0x2074901, 0x0, 0x0, 0x1ba)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x5b03530, 0x2074928, 0x5d49b90, 0x9e6b3a0, 0x5ac3310, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9e6b3a0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x5b03530, 0x2074928, 0x5d49b90, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4bf5440, 0x2080c30, 0x594b8f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 7561 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5104f30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 6873 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x59899a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringRemove(0x59899a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:165 +0x20
testing.tRunner(0x59899a0, 0x1d9d1ac)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6901 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x598eb40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHandleRemoteExecFailed(0x598eb40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:335 +0x1c
testing.tRunner(0x598eb40, 0x1d9d0bc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6841 [chan send, 4 minutes]:
testing.tRunner.func1(0x59885a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x59885a0, 0x1d9d128)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6874 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5989a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringUse(0x5989a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:223 +0x20
testing.tRunner(0x5989a40, 0x1d9d1b0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6739 [chan send, 3 minutes]:
testing.tRunner.func1(0x5cbd680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5cbd680, 0x1d9d304)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7670 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x4dd4c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 7176 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0ec80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate_NoCheck(0x4e0ec80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:190 +0x20
testing.tRunner(0x4e0ec80, 0x1d9d2bc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7665 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e4f0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfb4b4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5cfb4a0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5cfb4a0, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x51afc00, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x51afc00, 0x2, 0x2, 0x3f800000, 0x5c0aae0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x5c46000, 0x206b9a8, 0x51afc00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 7627 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x58b3de0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 6968 [chan receive, 5 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5d83ec0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5c46b40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5c46b40, 0x59f2040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5ce5a30, 0x5c46b40, 0x59f2040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7497 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x48db600, 0x4f59540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7631 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x51503f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7475 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x52735e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionDeleteDestroy(0x52735e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:602 +0x20
testing.tRunner(0x52735e0, 0x1d9d2c8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7664 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x5c46000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 7500 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x5c4c900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7630 [select]:
github.com/hashicorp/yamux.(*Session).send(0x51503f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7632 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x5150460, 0x5378000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x507c750)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x507c750, 0x9f7e040, 0x0, 0x5150464)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5532860, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x5c48100)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x507c780, 0x1845a70, 0x5a3b3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x507c780, 0x1845a70, 0x5a3b3c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x507c720, 0x1845a70, 0x5a3b3c0, 0x5a3b3c0, 0x51adb18)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x507c720, 0x5a3b3c0, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x51adb00, 0x2074928, 0x507c720, 0x566af90, 0x48a0464, 0x2074901, 0x507c720, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x51adb00, 0x2074928, 0x507c720, 0x9f68df0, 0x5c34320, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9f68df0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x51adb00, 0x2074928, 0x507c720, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4a7a900, 0x2080c30, 0x5150460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 7435 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x4ff36c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7519 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x5c4cea0, 0x1cad6ef, 0x5, 0x5c0d8a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7037 [chan receive, 5 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5a41780, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4a7a6c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a7a6c0, 0x591a980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5acccf0, 0x4a7a6c0, 0x591a980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7038 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x4b54cd0, 0x1cc0714, 0xf, 0x2052fd0, 0x52d3730, 0x1, 0x0, 0x1828b00, 0x590c400, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x4b54cd0, 0x20732c8, 0x5ac0ba0, 0x1cc0714, 0xf, 0x2052fd0, 0x52d3730, 0x1cae1b2, 0x5, 0x590c340)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7655 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x54de000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7040 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x4b54cd0, 0x1cc0a02, 0xf, 0x2052fe8, 0x4efd8c0, 0x1, 0x0, 0x1828bc8, 0x4a8d2f0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x4b54cd0, 0x20732c8, 0x5ac0ba0, 0x1cc0a02, 0xf, 0x2052fe8, 0x4efd8c0, 0x1cb690f, 0xa, 0x590c340)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7041 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x4b54cd0, 0x1cc096c, 0xf, 0x2053048, 0x48cf400, 0x4, 0x0, 0x1828b28, 0x51d8c00, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x4b54cd0, 0x20732c8, 0x5ac0ba0, 0x1cc096c, 0xf, 0x2053048, 0x48cf400, 0x5cd5a20, 0x16, 0x590c340)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7542 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x5163600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5163600, 0x4970e68)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7179 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4e0f9a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCustomTTL(0x4e0f9a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:328 +0x1c
testing.tRunner(0x4e0f9a0, 0x1d9d2c4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7078 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x590c500, 0x5c92d08, 0x20732c8, 0x5ac0f20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 7077 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5ac0f20, 0x52dea0c, 0x20, 0x20, 0x52521c0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5ac0e80, 0x20732c8, 0x5ac0f20, 0x5c92d08, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5ac0e80, 0x590c500, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4a7a6c0, 0x52d38ac, 0x590c4dc, 0x52deb70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x4b55e50, 0x52d3880, 0x590c4c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x549d098, 0x13, 0x1cac698, 0x4, 0x52ded4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x549d098, 0x13, 0x495054c, 0x3, 0x3, 0xfb370f01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5bbd7a0, 0x59797d0, 0x4dc92b8, 0x0, 0x4b55f40, 0x5a522c0, 0x1b9a5b0, 0x52d3880, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x59797d0, 0x2073568, 0x5ac0dc0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4a7a6c0, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x52d3730, 0x1828b00, 0x590c480, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4ef29a0, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x52d3730, 0x1828b00, 0x590c480, 0x5d55954, 0x4ba01e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x5c929d0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x5ac0da0, 0x2052fd0, 0x52d3730, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x5c929d0, 0x5cd57e0, 0x1828b00, 0x590c400, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 7629 [IO wait]:
internal/poll.runtime_pollWait(0xa531e574, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5d2c064, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5d2c050, 0x52ea000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5d2c050, 0x52ea000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5c92028, 0x52ea000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x507c690, 0x548a0b0, 0xc, 0xc, 0x5150460, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x507c690, 0x548a0b0, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x51503f0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x51503f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7557 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4d7b9e0, 0x1cad4b5, 0x5, 0x53a48c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7558 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4d7b9e0, 0x1cad6ef, 0x5, 0x53a48e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7515 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x5c4cea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7516 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x5c4cea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7488 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5ca1820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7559 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x535cd40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7551 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x5cee790, 0x2a05f200, 0x1, 0x4887c00, 0x5458080, 0x4971510)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7552 [select, 1 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x5cee790, 0x5458080)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7553 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x5cee790, 0x1dcd6500, 0x0, 0x4887c40, 0x5458080, 0x4971528)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7554 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4d7b9e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7604 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x5d2c410)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7520 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4bf5440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 7549 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x5cee790)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7555 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4d7b9e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7449 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4bf5440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 7454 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a7a900, 0x54578c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x516a270, 0x4a7a900, 0x54578c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7464 [select, 1 minutes]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5c91220, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5c91220, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5c91220, 0x1cb2cd8, 0x8, 0x4e78fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5c91220)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 7518 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x5c4cea0, 0x1cad4b5, 0x5, 0x5c0d880)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7546 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e364, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c34d34, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5c34d20, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5c34d20, 0x3, 0x3, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4bc7690, 0x594d074, 0x4994db4, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4bc7690, 0x12b94, 0x1ba48, 0x54d4930)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4887840, 0x4bc7690)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7493 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x48db600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7652 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4a16c60, 0x3b9aca00, 0x0, 0x496b5c0, 0x53edd00, 0x5910a08)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7547 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e0d0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c34d84, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5c34d70, 0x5406000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5c34d70, 0x5406000, 0x10000, 0x10000, 0x7d200, 0x5974a01, 0x1, 0x0, 0x7e701)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4970f98, 0x5406000, 0x10000, 0x10000, 0x59d2f34, 0x101, 0x59d2f08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4970f98, 0x5406000, 0x10000, 0x10000, 0x59d2f74, 0x4ef3440, 0x4ef343c, 0x16fe7f8, 0x4ef343c)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4887840, 0x4970f98)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7456 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x5c912c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7656 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x54de000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7562 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e67c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c34ec4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5c34eb0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5c34eb0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x52d8080, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x52d8080, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4887e40, 0x52d8080)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7521 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x510cfc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7522 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x5b03440, 0x5a85b40, 0x1cac33a, 0x3, 0x4f59180, 0x5a85a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 7523 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4bf5440, 0x0, 0x1d9d4d8, 0x5c4c900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 7524 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4bf5440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 7525 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531ea18, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3464, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac3450, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac3450, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5a33c00, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x5a33c00, 0x2, 0x2, 0x3f800000, 0x523f3f8)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4bf5440, 0x206b9a8, 0x5a33c00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 7526 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4bf5440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 7527 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4f9e5d0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x48c9e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 7528 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x48c9e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 7529 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x48c9e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 7530 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x48c9e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 7531 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e5f8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599c0b4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x599c0a0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x599c0a0, 0x4ba01e0, 0xb6d9936c, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x58b2160, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x58b2160, 0x5b900a0, 0x599c0a0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x4ed2280, 0x206b9a8, 0x58b2160, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x4ed2280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4f9e600, 0x1cac2e9, 0x3, 0x5d2a1e0, 0xf, 0x58b2150, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48c9e40, 0x4f9e600, 0x510d080, 0x510d0c0, 0x205db30, 0x5d2e620)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7532 [IO wait]:
internal/poll.runtime_pollWait(0xa4963bc8, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599c104, 0x72, 0xff00, 0xffff, 0x81718c0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x599c0f0, 0xa192000, 0xffff, 0xffff, 0x81718c0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x599c0f0, 0xa192000, 0xffff, 0xffff, 0x81718c0, 0x28, 0x28, 0xa4950658, 0x1d3a8, 0xa4950658, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5b900d0, 0xa192000, 0xffff, 0xffff, 0x81718c0, 0x28, 0x28, 0xb6d99008, 0x5554620, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5b900d0, 0xa192000, 0xffff, 0xffff, 0x81718c0, 0x28, 0x28, 0xb6d99008, 0x5554620, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5b900d0, 0xa192000, 0xffff, 0xffff, 0x62, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4ed2300, 0x5b900d0, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5b900f8, 0x5b900d0, 0x77359400, 0x0, 0xa0dc8d0, 0x1, 0x0, 0x0, 0x2054770, 0xa0dc8d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4ed2300, 0x5b900d0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4ed2300, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4f9e660, 0x1cac30d, 0x3, 0x5d2a230, 0xf, 0x58b2170, 0x510c090, 0x1)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48c9e40, 0x4f9e660, 0x510d080, 0x510d0c0, 0x205db48, 0x5d2e660)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7533 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e3e8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3874, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac3860, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac3860, 0x1b5c4c0, 0x4bb2b40, 0xb6d99000)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4edec60, 0x49, 0x18, 0x5a3a900)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4edec60, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x4edec60, 0x20, 0x1b5c4c0, 0x310001, 0x5a3a900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x5cc2ea0, 0x2064b48, 0x5d333b8, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x48c9e40, 0x4f59c40, 0x5b15b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 7534 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4bf5440, 0x4f59d40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5d2a1c0, 0x4bf5440, 0x4f59d40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7563 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531ecac, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c34f14, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5c34f00, 0x5460000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5c34f00, 0x5460000, 0x10000, 0x10000, 0xfa771d00, 0x10001, 0x20001, 0x0, 0x3)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x49715c8, 0x5460000, 0x10000, 0x10000, 0x597e734, 0x101, 0x597e708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x49715c8, 0x5460000, 0x10000, 0x10000, 0x1, 0x9e7f200, 0x0, 0x1, 0x10000)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4887e40, 0x49715c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7564 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x5cee840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7565 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x5cee840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7566 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x5cee840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7567 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x5cee840, 0x3b9aca00, 0x0, 0x4904c00, 0x5458580, 0x4971968)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7568 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x5cee840, 0x5458580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7569 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x5cee840, 0xbebc200, 0x0, 0x4904d00, 0x5458580, 0x4971978)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7570 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4fa0b40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7571 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4fa0b40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7572 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa0b40, 0x1cae9dd, 0x6, 0x535cde0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7573 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa0b40, 0x1cad4b5, 0x5, 0x535ce60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7574 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa0b40, 0x1cad6ef, 0x5, 0x535ce80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7575 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4a7a900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 7576 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4904d40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7577 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x51ad9e0, 0x4d4c200, 0x1cac33a, 0x3, 0x5377bc0, 0x4d4c100)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 7578 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4a7a900, 0x0, 0x1d9d4d8, 0x4d7b9e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 7579 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4a7a900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 7580 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e1d8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c346f4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5c346e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5c346e0, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4e63cd0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4e63cd0, 0x2, 0x2, 0x3f800000, 0x5c92028)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4a7a900, 0x206b9a8, 0x4e63cd0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 7581 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4a7a900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 7582 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x490cb40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x5c66840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 7583 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x5c66840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 7584 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x5c66840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 7585 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x5c66840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 7586 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e700, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3a54, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac3a40, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac3a40, 0x48643c0, 0xb6d99008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x51e8230, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x51e8230, 0x5d33bc8, 0x5ac3a40, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x50cc980, 0x206b9a8, 0x51e8230, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x50cc980, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x490cb70, 0x1cac2e9, 0x3, 0x516a2a0, 0xf, 0x51e8220, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5c66840, 0x490cb70, 0x4905bc0, 0x4905e00, 0x205db30, 0x4d9f260)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7587 [IO wait]:
internal/poll.runtime_pollWait(0xa531ec28, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599c4c4, 0x72, 0xff00, 0xffff, 0x4e9bb00)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x599c4b0, 0x9e3a000, 0xffff, 0xffff, 0x4e9bb00, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x599c4b0, 0x9e3a000, 0xffff, 0xffff, 0x4e9bb00, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5b90338, 0x9e3a000, 0xffff, 0xffff, 0x4e9bb00, 0x28, 0x28, 0xb6d9936c, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5b90338, 0x9e3a000, 0xffff, 0xffff, 0x4e9bb00, 0x28, 0x28, 0xb6d9936c, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5b90338, 0x9e3a000, 0xffff, 0xffff, 0x61, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4ed2780, 0x5b90338, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5b90360, 0x5b90338, 0x77359400, 0x0, 0x9c2c660, 0x1, 0x0, 0x0, 0x2054770, 0x9c2c660)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4ed2780, 0x5b90338, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4ed2780, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x490cba0, 0x1cac30d, 0x3, 0x5d2a950, 0xf, 0x58b2850, 0x0, 0x208f5f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5c66840, 0x490cba0, 0x4905bc0, 0x4905e00, 0x205db48, 0x4d9f2a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7588 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e2e0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c35054, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5c35040, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5c35040, 0x1b5c4c0, 0x32b9d50, 0xb6d99600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x52d8a60, 0xc2, 0x18, 0x51fd820)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x52d8a60, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x52d8a60, 0x20, 0x1b5c4c0, 0x310001, 0x51fd820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x51053b0, 0x2064b48, 0x4971a28, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x5c66840, 0x5458700, 0x5359ba0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 7605 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x51afbf0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7606 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x507c3f0, 0x0, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x5d3c200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7607 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4edc540, 0x1d9d490, 0x4bf5440, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4bf5440, 0x2080d50, 0x523f3f8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4bf5440, 0x2080d50, 0x523f3f8, 0x1d9ef00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7608 [IO wait]:
internal/poll.runtime_pollWait(0xa4963b44, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfb5f4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5cfb5e0, 0x5541000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5cfb5e0, 0x5541000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x523f3f8, 0x5541000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x507ca80, 0x5af6d10, 0xc, 0xc, 0x594b8f0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x507ca80, 0x5af6d10, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4edc540, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4edc540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7609 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4edc540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7610 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4edc540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7589 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x5a46a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x5a46a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x5a46a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46a00, 0x4d9c668)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7590 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x5a46a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46a00, 0x4d9c670)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7591 [select, 1 minutes]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x5a46a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46a00, 0x4d9c678)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7592 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x51b81a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7611 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x4ea39e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7612 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x4ea39e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7613 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531eebc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfb6e4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5cfb6d0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5cfb6d0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x535ff90, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x535ff90, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x514b440, 0x535ff90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7614 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531e46c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfb734, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5cfb720, 0x55e8000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5cfb720, 0x55e8000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x523f580, 0x55e8000, 0x10000, 0x10000, 0x55dd734, 0x101, 0x55dd708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x523f580, 0x55e8000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x514b440, 0x523f580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7615 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4da4370)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7616 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4da4370)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7617 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4da4370)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7634 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da4370, 0x2a05f200, 0x1, 0x514bb00, 0x5530000, 0x523f828)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7635 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4da4370, 0x5530000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7636 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da4370, 0x1dcd6500, 0x0, 0x514bb40, 0x5530000, 0x523f838)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7637 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4fa17a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7638 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4fa17a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7639 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa17a0, 0x1cae9dd, 0x6, 0x4d20700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7640 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa17a0, 0x1cad4b5, 0x5, 0x4d20720)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7641 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa17a0, 0x1cad6ef, 0x5, 0x4d20740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7642 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5243760)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 8059 [IO wait]:
internal/poll.runtime_pollWait(0xa532a02c, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5a61004, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5a60ff0, 0x5627000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5a60ff0, 0x5627000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5910b68, 0x5627000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5076a50, 0x54cdee0, 0xc, 0xc, 0x4ff2cb0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5076a50, 0x54cdee0, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4ff2c40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4ff2c40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7597 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5c46000, 0x5882f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x497da00, 0x5c46000, 0x5882f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7595 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5d2c3c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5d2c3c0, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5d2c3c0, 0x1cb2cd8, 0x8, 0x557efdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5d2c3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 7762 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked.func1(0x5c2ad20, 0x52dcf00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:195 +0x68
created by github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:193 +0x164

goroutine 7683 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963934, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x506f914, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x506f900, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x506f900, 0x1b5c4c0, 0x48f6780, 0xb6d99000)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x593e4c0, 0x11, 0x18, 0x4cd8200)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x593e4c0, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x593e4c0, 0x20, 0x1b5c4c0, 0x310001, 0x4cd8200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x5ba27e0, 0x2064b48, 0x5c0a2a0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x4ef2b00, 0x54e3d40, 0x5503780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 7646 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x547a140)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7648 [IO wait]:
internal/poll.runtime_pollWait(0xa531eba4, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x506f824, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x506f810, 0x52b7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x506f810, 0x52b7000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5c0a1d8, 0x52b7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5c6e5d0, 0x5cf0350, 0xc, 0xc, 0x5355b20, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5c6e5d0, 0x5cf0350, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x5355a40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x5355a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7649 [select]:
github.com/hashicorp/yamux.(*Session).send(0x5355a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7682 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x5355a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7633 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x5c31f80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7698 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5c31f80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7727 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4fa7de0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7728 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x4ea3950)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7729 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x4ea3950)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7730 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531efc4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3a04, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac39f0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac39f0, 0x112a24, 0x5e0a4778, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x50e89c0, 0x1698a9dc, 0x506f770, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x50e89c0, 0x5ce4164, 0x7229c, 0x4beb860)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x514b2c0, 0x50e89c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7731 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963ac0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3aa4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5ac3a90, 0x54ae000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5ac3a90, 0x54ae000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5b48480, 0x54ae000, 0x10000, 0x10000, 0x560c734, 0x101, 0x560c708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5b48480, 0x54ae000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x514b2c0, 0x5b48480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7732 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4ccc8f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7733 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4ccc8f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7734 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4ccc8f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7735 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccc8f0, 0x3b9aca00, 0x0, 0x514b900, 0x5458e00, 0x5b48728)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7736 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4ccc8f0, 0x5458e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7737 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccc8f0, 0xbebc200, 0x0, 0x514b940, 0x5458e00, 0x5b48738)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7738 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x53ab200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7739 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x53ab200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7740 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x53ab200, 0x1cae9dd, 0x6, 0x4fbe4a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7741 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x53ab200, 0x1cad4b5, 0x5, 0x4fbf060)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7742 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x53ab200, 0x1cad6ef, 0x5, 0x4fbf0a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7743 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x5a51680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 7744 [select, 1 minutes]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x514bc40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7745 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x50f8ba0, 0x5997ec0, 0x1cac33a, 0x3, 0x5458040, 0x5997e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 7746 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x5a51680, 0x0, 0x1d9d4d8, 0x53aa000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 7747 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x5a51680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 7748 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa531ed30, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bce474, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5bce460, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5bce460, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x58b3df0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x58b3df0, 0x2, 0x2, 0x3f800000, 0x4bc93a0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x5a51680, 0x206b9a8, 0x58b3df0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 7749 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x5a51680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 7750 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x5c2ad20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x4ef2b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 7751 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x4ef2b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 7752 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x4ef2b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 7753 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x4ef2b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 7754 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa49639b8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x506f8c4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x506f8b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x506f8b0, 0x48f6780, 0xb6d99008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x593e4b0, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x593e4b0, 0x5c0a280, 0x506f8b0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x5d3c100, 0x206b9a8, 0x593e4b0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x5d3c100, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5c2ad50, 0x1cac2e9, 0x3, 0x5cf04c0, 0xf, 0x593e4a0, 0x1923500, 0x5114fec)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4ef2b00, 0x5c2ad50, 0x4b580c0, 0x4b58140, 0x205db30, 0x5a41120)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7755 [IO wait]:
internal/poll.runtime_pollWait(0xa4963a3c, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3be4, 0x72, 0xff00, 0xffff, 0x81702d0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x5ac3bd0, 0x9e4a000, 0xffff, 0xffff, 0x81702d0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x5ac3bd0, 0x9e4a000, 0xffff, 0xffff, 0x81702d0, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5b487e8, 0x9e4a000, 0xffff, 0xffff, 0x81702d0, 0x28, 0x28, 0xb6d99008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5b487e8, 0x9e4a000, 0xffff, 0xffff, 0x81702d0, 0x28, 0x28, 0xb6d99008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5b487e8, 0x9e4a000, 0xffff, 0xffff, 0x61, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4d24600, 0x5b487e8, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5b48810, 0x5b487e8, 0x77359400, 0x0, 0x9b3d830, 0x1, 0x0, 0x0, 0x2054770, 0x9b3d830)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4d24600, 0x5b487e8, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4d24600, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5c2ad80, 0x1cac30d, 0x3, 0x4bd4590, 0xf, 0x535e2f0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4ef2b00, 0x5c2ad80, 0x4b580c0, 0x4b58140, 0x205db48, 0x5a41180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7599 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5bfa370, 0x20732c8, 0x5bdf2c0, 0x1cc0705, 0xf, 0x2052e80, 0x5bd12f0, 0x1cacd60, 0x4, 0x52dce80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:122 +0x2ac
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7598 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x5bfa370, 0x1cc0714, 0xf, 0x2052fd0, 0x50965b0, 0x9, 0x0, 0x1828b00, 0x52dd100, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5bfa370, 0x20732c8, 0x5bdf2c0, 0x1cc0714, 0xf, 0x2052fd0, 0x50965b0, 0x1cae1b2, 0x5, 0x52dce80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7758 [select, 1 minutes]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5bfa320, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5bfa320, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5bfa320, 0x1cb2cd8, 0x8, 0x5668fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5bfa320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 7693 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x51069a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7760 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x5c46000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 7761 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a51680, 0x5456300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5cf04a0, 0x5a51680, 0x5456300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7862 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5d0e340, 0x4bc8fe0, 0x20732c8, 0x5b2c960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 7772 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x52dd380, 0x5b8e5b0, 0x20732c8, 0x5bdfd80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 8147 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x5a47200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x5a47200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x5a47200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a47200, 0x50edaf0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7771 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5bdfd80, 0x53e79f4, 0x20, 0x20, 0x1, 0x5bc0790)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5bdfca0, 0x20732c8, 0x5bdfd80, 0x5b8e5b0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5bdfca0, 0x52dd380, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5a51680, 0x4fb4198, 0x5bd174c, 0x53e7b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x5b910f0, 0x4fb4180, 0x5bd1740, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x5b91140, 0x13, 0x1cac698, 0x4, 0x53e7d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x5b91140, 0x13, 0x482f54c, 0x3, 0x3, 0xc8cc3f01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5a52ae0, 0x50f8d20, 0x5418a58, 0x0, 0x5bfb9a0, 0x5bdfa20, 0x1b47298, 0x4fb4180, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x50f8d20, 0x2073568, 0x5bdfc40, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5a51680, 0x1cbfad5, 0xf, 0x1b47298, 0x4fb4060, 0x1828bc8, 0x5bd1710, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4ef2b00, 0x1cbfad5, 0xf, 0x1b47298, 0x4fb4060, 0x1828bc8, 0x5bd1710, 0x5bd16e4, 0x48654a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x5b48778, 0x1, 0x0, 0xb2c97000, 0x8b, 0x5bdfc20, 0x2052fe8, 0x4fb4060, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x5b48778, 0x5a7f5c0, 0x1828bc8, 0x5bd1650, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 7786 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4edd8f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7865 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x5bfa3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7787 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4edd8f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7684 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x5a221c0, 0x1d9d490, 0x5c46000, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x5c46000, 0x2080d50, 0x5c0aae0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x5c46000, 0x2080d50, 0x5c0aae0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7785 [IO wait]:
internal/poll.runtime_pollWait(0xa496382c, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfb644, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5cfb630, 0x5312000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5cfb630, 0x5312000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5b48c38, 0x5312000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4e4e5a0, 0x4df8f80, 0xc, 0xc, 0x4edd9d0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x4e4e5a0, 0x4df8f80, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4edd8f0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4edd8f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7783 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x5d2c460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7704 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x4bfcb40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7705 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x52a2df0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7706 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x496fe90, 0x0, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x50ccd80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7685 [IO wait]:
internal/poll.runtime_pollWait(0xa49637a8, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x506faa4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x506fa90, 0x5293000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x506fa90, 0x5293000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5c0aae0, 0x5293000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x59f48a0, 0x5cf0bd4, 0xc, 0xc, 0x5a22230, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x59f48a0, 0x5cf0bd4, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x5a221c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x5a221c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7686 [select]:
github.com/hashicorp/yamux.(*Session).send(0x5a221c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7687 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x5a221c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7688 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x5a22230, 0x5330000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x59f4930)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x59f4930, 0x9f81380, 0x0, 0x5a22234)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5147b20, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x593f7e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x59f4960, 0x1845a70, 0x51b8960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x59f4960, 0x1845a70, 0x51b8960, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x59f4900, 0x1845a70, 0x51b8960, 0x51b8960, 0x507c3a8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x59f4900, 0x51b8960, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x507c390, 0x2074928, 0x59f4900, 0x566ef90, 0x48a0464, 0x2074901, 0x59f4900, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x507c390, 0x2074928, 0x59f4900, 0x9e6bff0, 0x5cfb360, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9e6bff0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x507c390, 0x2074928, 0x59f4900, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x5c46000, 0x2080c30, 0x5a22230)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 7707 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x4edc620)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7708 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x4edc620)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7795 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x50aa400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x50aa400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x50aa400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x50aa400, 0x5b8e9d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7796 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x50aa400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x50aa400, 0x5b8e9d8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7797 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x50aa400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x50aa400, 0x5b8e9e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7798 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x514c3e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7799 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5a92e10)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7800 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5a92e10)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7801 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963724, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599c974, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x599c960, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x599c960, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5bc1240, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5bc1240, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5a84640, 0x5bc1240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7802 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa49636a0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599c9c4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x599c9b0, 0x5726000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x599c9b0, 0x5726000, 0x10000, 0x10000, 0x12d00, 0x20001, 0x30001, 0x0, 0x20000)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5b8ea50, 0x5726000, 0x10000, 0x10000, 0x50c1734, 0x101, 0x50c1708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5b8ea50, 0x5726000, 0x10000, 0x10000, 0x50c1728, 0x60a910, 0x0, 0x0, 0x50c1770)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5a84640, 0x5b8ea50)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7803 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4da44d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7804 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4da44d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7805 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4da44d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7806 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da44d0, 0x2a05f200, 0x1, 0x5a847c0, 0x5070400, 0x5b8ecf8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7807 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4da44d0, 0x5070400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7808 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da44d0, 0x1dcd6500, 0x0, 0x5a84800, 0x5070400, 0x5b8ed08)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7809 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4f0f8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7810 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4f0f8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7811 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4f0f8c0, 0x1cae9dd, 0x6, 0x514c4c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7812 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4f0f8c0, 0x1cad4b5, 0x5, 0x514c4e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7813 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4f0f8c0, 0x1cad6ef, 0x5, 0x514c500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7814 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x545fa80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7815 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5a933b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7816 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5a933b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7817 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa496361c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599cb04, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x599caf0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x599caf0, 0x18476e0, 0x19f1b0d, 0x43)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5bc1a00, 0x18476e0, 0x56bcf94, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5bc1a00, 0x208f5f0, 0x18476e0, 0x19d5fb4)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5a84a00, 0x5bc1a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7818 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963598, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x599cb54, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x599cb40, 0x5736000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x599cb40, 0x5736000, 0x10000, 0x10000, 0x0, 0x548ba01, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5b8ed68, 0x5736000, 0x10000, 0x10000, 0x56c1734, 0x101, 0x56c1708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5b8ed68, 0x5736000, 0x10000, 0x10000, 0x19cccb1, 0x3d, 0xe4, 0x548ba2c, 0x1)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5a84a00, 0x5b8ed68)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7819 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4da4580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7820 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4da4580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7821 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4da4580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7822 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da4580, 0x3b9aca00, 0x0, 0x5a84d00, 0x5070d80, 0x5b8f030)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7823 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4da4580, 0x5070d80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7824 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da4580, 0xbebc200, 0x0, 0x5a84d40, 0x5070d80, 0x5b8f040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7825 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x539a5a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7826 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x539a5a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7827 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x539a5a0, 0x1cae9dd, 0x6, 0x545fba0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7828 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x539a5a0, 0x1cad4b5, 0x5, 0x545fbc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7829 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x539a5a0, 0x1cad6ef, 0x5, 0x545fbe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7830 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4bf4fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 7831 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x5a84d80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7832 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x496fd40, 0x5140e00, 0x1cac33a, 0x3, 0x5357dc0, 0x5140d40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 7833 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4bf4fc0, 0x0, 0x1d9d4d8, 0x4f0f8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 7834 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4bf4fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 7835 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa49638b0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcf0a4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5bcf090, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5bcf090, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x52a33a0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x52a33a0, 0x2, 0x2, 0x3f800000, 0x4e245a8)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4bf4fc0, 0x206b9a8, 0x52a33a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 7836 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4bf4fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 7837 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x490ce10, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x4ef2c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 7838 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x4ef2c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 7839 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x4ef2c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 7840 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x4ef2c60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 7841 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963490, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfb8c4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5cfb8b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5cfb8b0, 0x4bb2000, 0xb6d99a34, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x535f340, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x535f340, 0x5b48e50, 0x5cfb8b0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x4d24a80, 0x206b9a8, 0x535f340, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x4d24a80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x490ce40, 0x1cac2e9, 0x3, 0x4df94e0, 0xf, 0x535f330, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4ef2c60, 0x490ce40, 0x5a84e40, 0x5a85000, 0x205db30, 0x4d89b60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7842 [IO wait]:
internal/poll.runtime_pollWait(0xa4963514, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c34834, 0x72, 0xff00, 0xffff, 0x80ecea0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x5c34820, 0x9e70000, 0xffff, 0xffff, 0x80ecea0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x5c34820, 0x9e70000, 0xffff, 0xffff, 0x80ecea0, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5b8f0f0, 0x9e70000, 0xffff, 0xffff, 0x80ecea0, 0x28, 0x28, 0xb6d99a34, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5b8f0f0, 0x9e70000, 0xffff, 0xffff, 0x80ecea0, 0x28, 0x28, 0xb6d99a34, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5b8f0f0, 0x9e70000, 0xffff, 0xffff, 0x61, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x5370880, 0x5b8f0f0, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5b8f118, 0x5b8f0f0, 0x77359400, 0x0, 0x9c17800, 0x1, 0x0, 0x0, 0x2054770, 0x9c17800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x5370880, 0x5b8f0f0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x5370880, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x490ce70, 0x1cac30d, 0x3, 0x5118a40, 0xf, 0x4e62300, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4ef2c60, 0x490ce70, 0x5a84e40, 0x5a85000, 0x205db48, 0x4d89ba0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7709 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa496340c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcf284, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5bcf270, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5bcf270, 0x1b5c4c0, 0x4ba01e0, 0xb6d99300)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5164480, 0x8e, 0x18, 0x5b151a0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5164480, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x5164480, 0x20, 0x1b5c4c0, 0x310001, 0x5b151a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x48a0510, 0x2064b48, 0x5075ba8, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x4ef2c60, 0x5361bc0, 0x5958060)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 8052 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x5506620)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 8058 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4ff2c40, 0x1d9d490, 0x5a44480, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x5a44480, 0x2080d50, 0x5910b68)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x5a44480, 0x2080d50, 0x5910b68, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7712 [select, 1 minutes]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x4bfcaf0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x4bfcaf0, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x4bfcaf0, 0x1cb2cd8, 0x8, 0x557dfdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x4bfcaf0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 7861 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5b2c960, 0x53e5a0c, 0x20, 0x20, 0x4bc8fd8, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5b2c8c0, 0x20732c8, 0x5b2c960, 0x4bc8fe0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5b2c8c0, 0x5d0e340, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5a51680, 0x5c3095c, 0x5d0e31c, 0x53e5b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x5bfb590, 0x5c30930, 0x5d0e300, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x5b91060, 0x13, 0x1cac698, 0x4, 0x53e5d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x5b91060, 0x13, 0x56dd54c, 0x3, 0x3, 0x6d6dd901, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5a52940, 0x50f8d20, 0x5af6058, 0x0, 0x5bfb680, 0x5244420, 0x1b9a5b0, 0x5c30930, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x50f8d20, 0x2073568, 0x5b2c800, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5a51680, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x50965b0, 0x1828b00, 0x5d0e2c0, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4ef2b00, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x50965b0, 0x1828b00, 0x5d0e2c0, 0x51ad744, 0x48f61e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x5b48770, 0x9, 0x0, 0xb2c97000, 0x8b, 0x5b2c7e0, 0x2052fd0, 0x50965b0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x5b48770, 0x5a7f580, 0x1828b00, 0x52dd100, 0x0, 0x0, 0x0, 0x0, 0x9, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 7788 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4bf4fc0, 0x501b300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4df94c0, 0x4bf4fc0, 0x501b300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8109 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4f30d20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7860 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x5a51680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 7856 [select, 1 minutes]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5bfa000, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5bfa000, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5bfa000, 0x1cb2cd8, 0x8, 0x5895fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5bfa000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 7875 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x577e2d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7876 [runnable]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x480f510)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7877 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5a1dc80, 0x0, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4ed2d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7879 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x5a46800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x5a46800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x5a46800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46800, 0x490b988)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7902 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x517abd0, 0x0, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x5c6a900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7694 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x51069a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7867 [IO wait]:
internal/poll.runtime_pollWait(0xa4963304, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcf914, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5bcf900, 0x57ad000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5bcf900, 0x57ad000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4bc9380, 0x57ad000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5c62540, 0x5af6b10, 0xc, 0xc, 0x57b0380, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5c62540, 0x5af6b10, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x57b02a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x57b02a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7868 [select]:
github.com/hashicorp/yamux.(*Session).send(0x57b02a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7869 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x57b02a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7870 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x57b03f0, 0x1d9d490, 0x5a51680, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x5a51680, 0x2080d50, 0x4bc93a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x5a51680, 0x2080d50, 0x4bc93a0, 0x1a73300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7871 [IO wait]:
internal/poll.runtime_pollWait(0xa4963280, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcf9b4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5bcf9a0, 0x57d6000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5bcf9a0, 0x57d6000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4bc93a0, 0x57d6000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5c62690, 0x5af6b40, 0xc, 0xc, 0x57b0460, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5c62690, 0x5af6b40, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x57b03f0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x57b03f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7872 [select]:
github.com/hashicorp/yamux.(*Session).send(0x57b03f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7873 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x57b03f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7890 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x57b0460, 0x57d7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x5c627b0)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x5c627b0, 0xa025560, 0x0, 0x57b0464)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5b59360, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x5a32200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x5c62840, 0x1845a70, 0x5bdee80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x5c62840, 0x1845a70, 0x5bdee80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x5c62780, 0x1845a70, 0x5bdee80, 0x5bdee80, 0x50f8d38)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x5c62780, 0x5bdee80, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x50f8d20, 0x2074928, 0x5c62780, 0x5751f90, 0x48a0400, 0x2074901, 0x0, 0x0, 0xe1)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x50f8d20, 0x2074928, 0x5c62780, 0xa030250, 0x5bce320, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0xa030250, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x50f8d20, 0x2074928, 0x5c62780, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x5a51680, 0x2080c30, 0x57b0460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 7880 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x5a46800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46800, 0x490b990)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7881 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x5a46800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a46800, 0x490b998)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7882 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5bbd400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7883 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x518a7e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7884 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x518a7e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7885 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa49631fc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a2794, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57a2780, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57a2780, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x58f4870, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x58f4870, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5834100, 0x58f4870)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7886 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963178, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a27e4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x57a27d0, 0x5862000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x57a27d0, 0x5862000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x490ba08, 0x5862000, 0x10000, 0x10000, 0x5792734, 0x101, 0x5792708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x490ba08, 0x5862000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5834100, 0x490ba08)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7887 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4ccd6b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7888 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4ccd6b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7889 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4ccd6b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7906 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccd6b0, 0x2a05f200, 0x1, 0x5834280, 0x4d22880, 0x490bc90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7907 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4ccd6b0, 0x4d22880)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7908 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccd6b0, 0x1dcd6500, 0x0, 0x58342c0, 0x4d22880, 0x490bca0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7909 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x54df320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7910 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x54df320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7911 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54df320, 0x1cae9dd, 0x6, 0x5bbd500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7912 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54df320, 0x1cad4b5, 0x5, 0x5bbd520)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7913 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54df320, 0x1cad6ef, 0x5, 0x5bbd540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7914 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5cdf720)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7915 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x518b5f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7916 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x518b5f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7917 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa49630f4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a2924, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57a2910, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57a2910, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x58f5060, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x58f5060, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5834480, 0x58f5060)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7918 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963070, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a2974, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x57a2960, 0x5872000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x57a2960, 0x5872000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x490bd00, 0x5872000, 0x10000, 0x10000, 0x5842734, 0x101, 0x5842708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x490bd00, 0x5872000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5834480, 0x490bd00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7919 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4ccd760)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7920 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4ccd760)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7921 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4ccd760)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7850 [IO wait]:
internal/poll.runtime_pollWait(0xa532a5d8, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac3e64, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5ac3e50, 0x5292000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5ac3e50, 0x5292000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x595c8a8, 0x5292000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5cadb60, 0x5c14180, 0xc, 0xc, 0x52d2fc0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5cadb60, 0x5c14180, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x52d5ab0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x52d5ab0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7922 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccd760, 0x3b9aca00, 0x0, 0x5834600, 0x4d22d80, 0x490bfa8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7923 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4ccd760, 0x4d22d80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7924 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccd760, 0xbebc200, 0x0, 0x5834640, 0x4d22d80, 0x490bfb8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7925 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x54df9e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7926 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x54df9e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7927 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54df9e0, 0x1cae9dd, 0x6, 0x5cdf7c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7928 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54df9e0, 0x1cad4b5, 0x5, 0x5cdf7e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7929 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54df9e0, 0x1cad6ef, 0x5, 0x5cdf800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7930 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4a7ab40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 7931 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x5834680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7932 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x5a1db30, 0x5d44780, 0x1cac33a, 0x3, 0x4d22480, 0x5d44680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 7933 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4a7ab40, 0x0, 0x1d9d4d8, 0x54df320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 7934 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4a7ab40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 7935 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4963388, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a2334, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57a2320, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57a2320, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x480f520, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x480f520, 0x2, 0x2, 0x3f800000, 0x595c8a8)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4a7ab40, 0x206b9a8, 0x480f520)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 7936 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4a7ab40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 7937 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x5b04ba0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x4dd4dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 7938 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x4dd4dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 7939 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x4dd4dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 7940 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x4dd4dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 7695 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4962f68, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x58261f4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x58261e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x58261e0, 0x4bb23c0, 0xb6d99008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5b20020, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x5b20020, 0x5c0b698, 0x58261e0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x5d3cf00, 0x206b9a8, 0x5b20020, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x5d3cf00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5b03890, 0x1cac2e9, 0x3, 0x5cf1f80, 0xf, 0x5b20010, 0x1, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4dd4dc0, 0x5b03890, 0x57e3780, 0x57e37c0, 0x205db30, 0x5d2e460)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7696 [IO wait]:
internal/poll.runtime_pollWait(0xa4962fec, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x58261a4, 0x72, 0xff00, 0xffff, 0x8170cc0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x5826190, 0x9ef0000, 0xffff, 0xffff, 0x8170cc0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x5826190, 0x9ef0000, 0xffff, 0xffff, 0x8170cc0, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5c0b620, 0x9ef0000, 0xffff, 0xffff, 0x8170cc0, 0x28, 0x28, 0xb6d99008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5c0b620, 0x9ef0000, 0xffff, 0xffff, 0x8170cc0, 0x28, 0x28, 0xb6d99008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5c0b620, 0x9ef0000, 0xffff, 0xffff, 0x62, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x5d3ce80, 0x5c0b620, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5c0b648, 0x5c0b620, 0x77359400, 0x0, 0x9ed4630, 0x1, 0x0, 0x0, 0x2054770, 0x9ed4630)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x5d3ce80, 0x5c0b620, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x5d3ce80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5b038c0, 0x1cac30d, 0x3, 0x5cf1f00, 0xf, 0x5c39fd0, 0x510c090, 0x1)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4dd4dc0, 0x5b038c0, 0x57e3780, 0x57e37c0, 0x205db48, 0x5d2e4a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 7697 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4962ee4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5826244, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5826230, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5826230, 0x1b5c4c0, 0x4bb23c0, 0xb6d99000)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5b20030, 0xba, 0x18, 0x5d4b720)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5b20030, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x5b20030, 0x20, 0x1b5c4c0, 0x310001, 0x5d4b720)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x5ba38c0, 0x2064b48, 0x5c0b6b8, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x4dd4dc0, 0x4d8f880, 0x5d20440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 7943 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4cf75e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 8060 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4ff2c40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7956 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x577e280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x577e280, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x577e280, 0x1cb2cd8, 0x8, 0x566ffdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x577e280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 7852 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x52d5ab0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7941 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4bf4fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 7952 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x4cf6a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7962 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x4bfcbe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7942 [IO wait]:
internal/poll.runtime_pollWait(0xa4962e60, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x58266a4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5826690, 0x58e0000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5826690, 0x58e0000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4e24588, 0x58e0000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5d0c720, 0x5b1b5a0, 0xc, 0xc, 0x4cf76c0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5d0c720, 0x5b1b5a0, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4cf75e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4cf75e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7944 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4cf75e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7945 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4cf7730, 0x1d9d490, 0x4bf4fc0, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4bf4fc0, 0x2080d50, 0x4e245a8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4bf4fc0, 0x2080d50, 0x4e245a8, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7946 [IO wait]:
internal/poll.runtime_pollWait(0xa4962ddc, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a2e24, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x57a2e10, 0x58e3000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x57a2e10, 0x58e3000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4e245a8, 0x58e3000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5d0c8a0, 0x5b1b5d0, 0xc, 0xc, 0x4cf77a0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5d0c8a0, 0x5b1b5d0, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4cf7730, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4cf7730)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7947 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4cf7730)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7948 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4cf7730)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7949 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4cf77a0, 0x58e8000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x5d0c930)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x5d0c930, 0x9fdfcc0, 0x0, 0x4cf77a4)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5bec100, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x5cb07a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x5d0c960, 0x1845a70, 0x5b15280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x5d0c960, 0x1845a70, 0x5b15280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x5d0c900, 0x1845a70, 0x5b15280, 0x5b15280, 0x496fe48)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x5d0c900, 0x5b15280, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x496fe30, 0x2074928, 0x5d0c900, 0x5671f90, 0x48a0400, 0x2074901, 0x0, 0x0, 0x97)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x496fe30, 0x2074928, 0x5d0c900, 0x9f4fd60, 0x5bcef50, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9f4fd60, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x496fe30, 0x2074928, 0x5d0c900, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4bf4fc0, 0x2080c30, 0x4cf77a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 7964 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a7ab40, 0x50e0ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5af6eb0, 0x4a7ab40, 0x50e0ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7851 [select]:
github.com/hashicorp/yamux.(*Session).send(0x52d5ab0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7900 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x5bfa050)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 8046 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5aa9710)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7848 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a6e0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfb7d4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5cfb7c0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5cfb7c0, 0x1b5c4c0, 0x48f61e0, 0xb6d99600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x58b2500, 0x3d, 0x18, 0x5ca0780)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x58b2500, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x58b2500, 0x20, 0x1b5c4c0, 0x310001, 0x5ca0780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x4dc47e0, 0x2064b48, 0x5b90370, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x5c669a0, 0x58d6cc0, 0x5a53d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 7969 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x5826eb0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7970 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x5a192c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7971 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5a4c060, 0x0, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x491eb00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7901 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x5252470)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7953 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x4cf6a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7974 [runnable]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x48dc200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x48dc200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x48dc200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x48dc200, 0x595c030)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7975 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x48dc200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x48dc200, 0x595c038)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7976 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x48dc200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x48dc200, 0x595c040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7977 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x59846e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7978 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5aa8240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7979 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5aa8240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7980 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4962cd4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c90d34, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5c90d20, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5c90d20, 0x4cf77a0, 0x0, 0xdef544)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x59621d0, 0x496fe30, 0x2074928, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x59621d0, 0xc, 0x5190000, 0x589ef70)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x54dadc0, 0x59621d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7981 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4962c50, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c90d84, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5c90d70, 0x5614000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5c90d70, 0x5614000, 0x10000, 0x10000, 0x490b200, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x595c0b0, 0x5614000, 0x10000, 0x10000, 0x589f734, 0x101, 0x589f708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x595c0b0, 0x5614000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x54dadc0, 0x595c0b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7982 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4cccdc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7983 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4cccdc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7984 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4cccdc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7985 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4cccdc0, 0x2a05f200, 0x1, 0x54db080, 0x54eab80, 0x595c338)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 8002 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4cccdc0, 0x54eab80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 8003 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4cccdc0, 0x1dcd6500, 0x0, 0x54db0c0, 0x54eab80, 0x595c348)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 8004 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4fa0ea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 8005 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4fa0ea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 8006 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa0ea0, 0x1cae9dd, 0x6, 0x5984780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 8007 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa0ea0, 0x1cad4b5, 0x5, 0x59847a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 8008 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa0ea0, 0x1cad6ef, 0x5, 0x59847c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 8009 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5ac1320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 8010 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5aa86c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 8011 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5aa86c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 8012 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a8f0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c90f64, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5c90f50, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5c90f50, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x59629e0, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x59629e0, 0x0, 0x4e42840, 0x4df8710)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x54db280, 0x59629e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 8013 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a86c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c91004, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5c90ff0, 0x5632000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5c90ff0, 0x5632000, 0x10000, 0x10000, 0xe8400, 0x4864301, 0xb6d99601, 0x0, 0x174e00c)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x595c3a8, 0x5632000, 0x10000, 0x10000, 0x5849734, 0x101, 0x5849708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x595c3a8, 0x5632000, 0x10000, 0x10000, 0x5e0a4785, 0x0, 0x1dd9d3b, 0x4985590, 0x1801ca)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x54db280, 0x595c3a8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 8014 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4ccce70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 8015 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4ccce70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 8016 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4ccce70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 8017 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccce70, 0x3b9aca00, 0x0, 0x54db440, 0x54eb680, 0x595c650)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 8018 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4ccce70, 0x54eb680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 8019 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccce70, 0xbebc200, 0x0, 0x54db480, 0x54eb680, 0x595c660)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 8020 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4fa1e60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 8021 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4fa1e60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 8022 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa1e60, 0x1cae9dd, 0x6, 0x5ac13c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 8023 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa1e60, 0x1cad4b5, 0x5, 0x5ac13e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 8024 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4fa1e60, 0x1cad6ef, 0x5, 0x5ac1400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 8025 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x5a44480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 8026 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x54db4c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 8027 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x59c5ef0, 0x5a31040, 0x1cac33a, 0x3, 0x54ea0c0, 0x5a30fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 8028 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x5a44480, 0x0, 0x1d9d4d8, 0x4fa0ea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 8029 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x5a44480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 8030 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa4962d58, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5a60f14, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5a60f00, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5a60f00, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5a192d0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x5a192d0, 0x2, 0x2, 0x3f800000, 0x5910b68)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x5a44480, 0x206b9a8, 0x5a192d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 8031 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x5a44480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 8032 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x5d5edb0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x5c669a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 8033 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x5c669a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 8034 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x5c669a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 8035 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x5c669a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 8036 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a764, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5ac34b4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5ac34a0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5ac34a0, 0x48643c0, 0xb6d99008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5963280, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x5963280, 0x595c788, 0x5ac34a0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x48b6b80, 0x206b9a8, 0x5963280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x48b6b80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5d5ede0, 0x1cac2e9, 0x3, 0x5ce4870, 0xf, 0x5963270, 0x0, 0x4fedd88)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5c669a0, 0x5d5ede0, 0x54db580, 0x54db5c0, 0x205db30, 0x5c4fc60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 8037 [IO wait]:
internal/poll.runtime_pollWait(0xa532a7e8, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5c91144, 0x72, 0xff00, 0xffff, 0x81707e0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x5c91130, 0x9e90000, 0xffff, 0xffff, 0x81707e0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x5c91130, 0x9e90000, 0xffff, 0xffff, 0x81707e0, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x595c710, 0x9e90000, 0xffff, 0xffff, 0x81707e0, 0x28, 0x28, 0xb6d99008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x595c710, 0x9e90000, 0xffff, 0xffff, 0x81707e0, 0x28, 0x28, 0xb6d99008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x595c710, 0x9e90000, 0xffff, 0xffff, 0x61, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x48b6a80, 0x595c710, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x595c738, 0x595c710, 0x77359400, 0x0, 0x9e66d80, 0x1, 0x0, 0x0, 0x2054770, 0x9e66d80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x48b6a80, 0x595c710, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x48b6a80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5d5ee10, 0x1cac30d, 0x3, 0x5ce47e0, 0xf, 0x5963230, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5c669a0, 0x5d5ee10, 0x54db580, 0x54db5c0, 0x205db48, 0x5c4fca0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 8054 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x5a44480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 7989 [select, 1 minutes]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5826e60, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5826e60, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5826e60, 0x1cb2cd8, 0x8, 0x5676fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5826e60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 7853 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a1b8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfbc84, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5cfbc70, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5cfbc70, 0x1b5c4c0, 0x48645a0, 0xb6d99600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x554f810, 0x8c, 0x18, 0x5a89160)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x554f810, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x554f810, 0x20, 0x1b5c4c0, 0x310001, 0x5a89160)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x4dc5200, 0x2064b48, 0x5b91d40, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x4dd4f20, 0x553d280, 0x4ea6200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 8958 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6e820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_Size_Net(0x4d6e820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:66 +0x20
testing.tRunner(0x4d6e820, 0x1d9d32c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7894 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4a7ab40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 8047 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5aa9710)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 8041 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a44480, 0x54ebe40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4e38ae0, 0x5a44480, 0x54ebe40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 7996 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x577e320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7998 [IO wait]:
internal/poll.runtime_pollWait(0xa532a65c, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4bfc4c4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4bfc4b0, 0x5388000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4bfc4b0, 0x5388000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x59101b0, 0x5388000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5b04d80, 0x5c14190, 0xc, 0xc, 0x52d2b60, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5b04d80, 0x5c14190, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x52d2a80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x52d2a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8042 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x52d5ab0, 0x1d9d490, 0x4a7ab40, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4a7ab40, 0x2080d50, 0x595c8a8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4a7ab40, 0x2080d50, 0x595c8a8, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7999 [select]:
github.com/hashicorp/yamux.(*Session).send(0x52d2a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 8000 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x52d2a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8001 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x52d2fc0, 0x53b7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x5b04e10)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x5b04e10, 0x9fd8660, 0x0, 0x52d2fc4)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x52ed280, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x5b20af0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x5b04e40, 0x1845a70, 0x5d4b820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x5b04e40, 0x1845a70, 0x5d4b820, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x5b04de0, 0x1845a70, 0x5d4b820, 0x5d4b820, 0x5a1dc38)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x5b04de0, 0x5d4b820, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x5a1dc20, 0x2074928, 0x5b04de0, 0x568bf90, 0x48a0400, 0x2074901, 0x0, 0x0, 0xdd)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x5a1dc20, 0x2074928, 0x5b04de0, 0x9fb68c0, 0x57a21e0, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9fb68c0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x5a1dc20, 0x2074928, 0x5b04de0, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4a7ab40, 0x2080c30, 0x52d2fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 8048 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a4d0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcf234, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5bcf220, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5bcf220, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4e62900, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4e62900, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5c09880, 0x4e62900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 8053 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5506620)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 8045 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4ccf560)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 8055 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x48dca00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x48dca00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x48dca00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x48dca00, 0x5910ab0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 8056 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x48dca00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x48dca00, 0x5910ab8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 8057 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x48dca00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x48dca00, 0x5910ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 8049 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a44c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcf2d4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5bcf2c0, 0x5804000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5bcf2c0, 0x5804000, 0x10000, 0x10000, 0x0, 0x5c14b01, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x595cd18, 0x5804000, 0x10000, 0x10000, 0x55b0734, 0x101, 0x55b0708, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x595cd18, 0x5804000, 0x10000, 0x10000, 0x19cccb1, 0x3d, 0xe4, 0x5c14b74, 0x1)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5c09880, 0x595cd18)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 8066 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4ccd290)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 8067 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4ccd290)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 8068 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4ccd290)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 8069 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccd290, 0x2a05f200, 0x1, 0x5c09a40, 0x54d4880, 0x595cfa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 8070 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4ccd290, 0x54d4880)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 8071 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4ccd290, 0x1dcd6500, 0x0, 0x5c09a80, 0x54d4880, 0x595cfb0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 8072 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x54de7e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 8073 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x54de7e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 8074 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54de7e0, 0x1cae9dd, 0x6, 0x4ccf660)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 8075 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54de7e0, 0x1cad4b5, 0x5, 0x4ccf6e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 8076 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54de7e0, 0x1cad6ef, 0x5, 0x4ccf700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 8077 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x52d6e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7904 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5104480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7905 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5104480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 8082 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a3c8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a38c4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57a38b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57a38b0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a8b340, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4a8b340, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5ca30c0, 0x4a8b340)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 8083 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a344, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a3914, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x57a3900, 0x5924000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x57a3900, 0x5924000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x50ed248, 0x5924000, 0x10000, 0x10000, 0x56ccf34, 0x101, 0x56ccf08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x50ed248, 0x5924000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5ca30c0, 0x50ed248)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 8084 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x5ceeb00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 8085 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x5ceeb00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 8086 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x5ceeb00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 8087 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x5ceeb00, 0x3b9aca00, 0x0, 0x5ca3240, 0x54e2200, 0x50ed508)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 8088 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x5ceeb00, 0x54e2200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 8089 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x5ceeb00, 0xbebc200, 0x0, 0x5ca3280, 0x54e2200, 0x50ed518)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 8090 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x54ded80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 8091 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x54ded80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 8092 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54ded80, 0x1cae9dd, 0x6, 0x524e5c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 8093 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54ded80, 0x1cad4b5, 0x5, 0x524e600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 8094 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x54ded80, 0x1cad6ef, 0x5, 0x524e680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 8095 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x48cb200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 8096 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x5ca32c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 8097 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x517a9f0, 0x52dd900, 0x1cac33a, 0x3, 0x5965e80, 0x52dd800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 8098 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x48cb200, 0x0, 0x1d9d4d8, 0x54de7e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 8099 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x48cb200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 8100 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a554, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a3694, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57a3680, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57a3680, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5252480, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x5252480, 0x2, 0x2, 0x3f800000, 0x5911a60)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x48cb200, 0x206b9a8, 0x5252480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 8101 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x48cb200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 8102 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x524c3c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x4dd4f20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 8103 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x4dd4f20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 8104 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x4dd4f20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 8105 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x4dd4f20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 8106 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a23c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5cfbc34, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5cfbc20, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5cfbc20, 0x48645a0, 0xb6d996d0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x554f800, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x554f800, 0x5b91cc0, 0x5cfbc20, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x5cea280, 0x206b9a8, 0x554f800, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x5cea280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x524c450, 0x1cac2e9, 0x3, 0x5c149e0, 0xf, 0x554f7f0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4dd4f20, 0x524c450, 0x5ca3380, 0x5ca33c0, 0x205db30, 0x51fd940)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 8107 [IO wait]:
internal/poll.runtime_pollWait(0xa532a2c0, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57a3a54, 0x72, 0xff00, 0xffff, 0x83d22a0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x57a3a40, 0x9fa2000, 0xffff, 0xffff, 0x83d22a0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x57a3a40, 0x9fa2000, 0xffff, 0xffff, 0x83d22a0, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x50ed5c8, 0x9fa2000, 0xffff, 0xffff, 0x83d22a0, 0x28, 0x28, 0xb6d996d0, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x50ed5c8, 0x9fa2000, 0xffff, 0xffff, 0x83d22a0, 0x28, 0x28, 0xb6d996d0, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x50ed5c8, 0x9fa2000, 0xffff, 0xffff, 0x62, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x5c6ac00, 0x50ed5c8, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x50ed5f0, 0x50ed5c8, 0x77359400, 0x0, 0x9f7aff0, 0x1, 0x0, 0x0, 0x2054770, 0x9f7aff0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x5c6ac00, 0x50ed5c8, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x5c6ac00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x524c480, 0x1cac30d, 0x3, 0x4ec7110, 0xf, 0x4bc6320, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4dd4f20, 0x524c480, 0x5ca3380, 0x5ca33c0, 0x205db48, 0x51fd9c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 8079 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x48cb200, 0x54d4e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x54cde00, 0x48cb200, 0x54d4e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8063 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa5329c90, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5a61414, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5a61400, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5a61400, 0x1b5c4c0, 0x4bb2b40, 0xb6d99a00)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5165740, 0xc6, 0x18, 0x5c1b8a0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5165740, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x5165740, 0x20, 0x1b5c4c0, 0x310001, 0x5c1b8a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x5d62360, 0x2064b48, 0x5910c60, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x5c66160, 0x53203c0, 0x5bed4c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 8121 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x5c35590)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 8122 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x535f010)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 8123 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5a1aab0, 0x0, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x50cc700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 8131 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x5826f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 8108 [IO wait]:
internal/poll.runtime_pollWait(0xa532a0b0, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcfb44, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5bcfb30, 0x590f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5bcfb30, 0x590f000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x50ed650, 0x590f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5c2a360, 0x54cdef0, 0xc, 0xc, 0x4f30e00, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5c2a360, 0x54cdef0, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4f30d20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4f30d20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8110 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4f30d20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8061 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4ff2c40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8062 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4ff2cb0, 0x5934000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x5076ae0)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x5076ae0, 0x9fef740, 0x0, 0x4ff2cb4)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5b2cce0, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x5164650)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x5076b10, 0x1845a70, 0x5cc6640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x5076b10, 0x1845a70, 0x5cc6640, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x5076ab0, 0x1845a70, 0x5cc6640, 0x5cc6640, 0x5a4c018)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x5076ab0, 0x5cc6640, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x5a4c000, 0x2074928, 0x5076ab0, 0x59dcf90, 0x48a0400, 0x2074901, 0x0, 0x0, 0x1b4)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x5a4c000, 0x2074928, 0x5076ab0, 0x9ff68b0, 0x5a60dc0, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9ff68b0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x5a4c000, 0x2074928, 0x5076ab0, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x5a44480, 0x2080c30, 0x4ff2cb0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 8111 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x4f30a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 8112 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x4f30a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 17166 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6ddf3c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x61c6fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x61c6fc0, 0x67ad580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6b32c30, 0x61c6fc0, 0x67ad580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8148 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x5a47200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a47200, 0x50edaf8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 8149 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x5a47200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x5a47200, 0x50edb00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 8125 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5c4e520)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 8126 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5ba3e60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 8127 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5ba3e60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 8128 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa5329fa8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5903a04, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x59039f0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x59039f0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5bc0820, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5bc0820, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5ab4080, 0x5bc0820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 8129 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa5329f24, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5903a54, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5903a40, 0x5e38000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5903a40, 0x5e38000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x551ac70, 0x5e38000, 0x10000, 0x10000, 0x58c8f34, 0x101, 0x58c8f08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x551ac70, 0x5e38000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5ab4080, 0x551ac70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 8162 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x48dbe40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 8163 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x48dbe40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 8164 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x48dbe40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 8165 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x48dbe40, 0x2a05f200, 0x1, 0x5ab4200, 0x5a3c9c0, 0x551aef8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 8166 [select, 1 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x48dbe40, 0x5a3c9c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 8167 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x48dbe40, 0x1dcd6500, 0x0, 0x5ab4240, 0x5a3c9c0, 0x551af08)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 8168 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x58f30e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 8169 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x58f30e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 8170 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x58f30e0, 0x1cae9dd, 0x6, 0x5c4e5e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 8171 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x58f30e0, 0x1cad4b5, 0x5, 0x5c4e600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 8172 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x58f30e0, 0x1cad6ef, 0x5, 0x5c4e620)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 8173 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5c5eea0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 8174 [select, 5 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x5d04510)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 8175 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5d04510)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 8176 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa5329ea0, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5903b94, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5903b80, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5903b80, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5bc10c0, 0x0, 0x0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5bc10c0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5ab4400, 0x5bc10c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 8177 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa5329e1c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5903be4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5903bd0, 0x5e48000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5903bd0, 0x5e48000, 0x10000, 0x10000, 0x5a2ed00, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x551af68, 0x5e48000, 0x10000, 0x10000, 0x5a06f34, 0x101, 0x5a06f08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x551af68, 0x5e48000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5ab4400, 0x551af68)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 8178 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4da4420)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 8179 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4da4420)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 8180 [select, 5 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4da4420)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 8181 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da4420, 0x3b9aca00, 0x0, 0x5ab4640, 0x5a3ce80, 0x551b210)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 8182 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4da4420, 0x5a3ce80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 8183 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4da4420, 0xbebc200, 0x0, 0x5ab49c0, 0x5a3ce80, 0x551b220)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 8184 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x58f3680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 8185 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x58f3680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 8186 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x58f3680, 0x1cae9dd, 0x6, 0x5c5ef80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 8187 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x58f3680, 0x1cad4b5, 0x5, 0x5c5efa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 8188 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x58f3680, 0x1cad6ef, 0x5, 0x5c5f000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 8189 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x5a8b8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 8190 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x5ab4a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 8191 [select, 5 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x5a1a960, 0x58b1600, 0x1cac33a, 0x3, 0x5a3c5c0, 0x58b1580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 8192 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x5a8b8c0, 0x0, 0x1d9d4d8, 0x58f30e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 8193 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x5a8b8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 8194 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa532a134, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x59035f4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x59035e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x59035e0, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x535f020, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x535f020, 0x2, 0x2, 0x3f800000, 0x54f83b0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x5a8b8c0, 0x206b9a8, 0x535f020)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 8195 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x5a8b8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 8196 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x5b9cc00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x5c66160)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 8197 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x5c66160)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 8198 [select, 5 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x5c66160)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 8199 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x5c66160)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 8200 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0xa5329d14, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5bcfdc4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5bcfdb0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5bcfdb0, 0x48645a0, 0xb6d99008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x593f840, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x593f840, 0x595d658, 0x5bcfdb0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x48b7b80, 0x206b9a8, 0x593f840, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x48b7b80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5b9cc30, 0x1cac2e9, 0x3, 0x53a6eb0, 0xf, 0x593f830, 0x30282073, 0x7331302e)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5c66160, 0x5b9cc30, 0x5ab4ac0, 0x5ab4b00, 0x205db30, 0x5c126e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 8201 [IO wait]:
internal/poll.runtime_pollWait(0xa5329d98, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5903d24, 0x72, 0xff00, 0xffff, 0x80ed500)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x5903d10, 0x9fb8000, 0xffff, 0xffff, 0x80ed500, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x5903d10, 0x9fb8000, 0xffff, 0xffff, 0x80ed500, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x551b2d0, 0x9fb8000, 0xffff, 0xffff, 0x80ed500, 0x28, 0x28, 0xb6d99a34, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x551b2d0, 0x9fb8000, 0xffff, 0xffff, 0x80ed500, 0x28, 0x28, 0xb6d99a34, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x551b2d0, 0x9fb8000, 0xffff, 0xffff, 0x62, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x50ccf80, 0x551b2d0, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x551b2f8, 0x551b2d0, 0x77359400, 0x0, 0x9f966c0, 0x1, 0x0, 0x0, 0x2054770, 0x9f966c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x50ccf80, 0x551b2d0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x50ccf80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5b9cc60, 0x1cac30d, 0x3, 0x5cf1130, 0xf, 0x5bc18d0, 0x7279656b, 0x20676e69)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5c66160, 0x5b9cc60, 0x5ab4ac0, 0x5ab4b00, 0x205db48, 0x5c12720)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 8212 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x48cb200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 8204 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x5c35590, 0x1cc0714, 0xf, 0x2052fd0, 0x5a6d500, 0xa, 0x0, 0x1828b00, 0x56e4300, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5c35590, 0x20732c8, 0x5afdc60, 0x1cc0714, 0xf, 0x2052fd0, 0x5a6d500, 0x1cae1b2, 0x5, 0x56e41c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 8210 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5c35540, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5c35540, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5c35540, 0x1cb2cd8, 0x8, 0x5685fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5c35540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 33910 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x9b24750)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 8205 [select, 1 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5c35590, 0x20732c8, 0x5afdc60, 0x1cc0705, 0xf, 0x2052e80, 0x5a4d860, 0x1cacd60, 0x4, 0x56e41c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:122 +0x2ac
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 8215 [select]:
github.com/hashicorp/yamux.(*Session).send(0x595ac40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 29850 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x895f300, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5ac5200, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5ac5200, 0x82f8e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6a54fd0, 0x5ac5200, 0x82f8e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8151 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a8b8c0, 0x5117700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x53a6e90, 0x5a8b8c0, 0x5117700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8136 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x5bfa0a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 8214 [IO wait]:
internal/poll.runtime_pollWait(0xa5329c0c, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5af2294, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5af2280, 0x5b0b000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5af2280, 0x5b0b000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5911a40, 0x5b0b000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5a9f080, 0x548b800, 0xc, 0xc, 0x595ad20, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5a9f080, 0x548b800, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x595ac40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x595ac40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8216 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x595ac40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8217 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x595ad90, 0x1d9d490, 0x48cb200, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x48cb200, 0x2080d50, 0x5911a60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x48cb200, 0x2080d50, 0x5911a60, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 8218 [IO wait]:
internal/poll.runtime_pollWait(0xa5329b88, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5a619b4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5a619a0, 0x5b1e000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5a619a0, 0x5b1e000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5911a60, 0x5b1e000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5a9f1d0, 0x548b830, 0xc, 0xc, 0x595ae00, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5a9f1d0, 0x548b830, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x595ad90, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x595ad90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8219 [select]:
github.com/hashicorp/yamux.(*Session).send(0x595ad90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 8220 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x595ad90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8221 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x595ae00, 0x5b1f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x5a9f260)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x5a9f260, 0xa025b00, 0x0, 0x595ae04)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5b181e0, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x5c49660)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x5a9f290, 0x1845a70, 0x5a89260)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x5a9f290, 0x1845a70, 0x5a89260, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x5a9f230, 0x1845a70, 0x5a89260, 0x5a89260, 0x517ab88)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x5a9f230, 0x5a89260, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x517ab70, 0x2074928, 0x5a9f230, 0x5d6ff90, 0x48a0400, 0x13001, 0x0, 0xa334b9b8, 0xfd)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x517ab70, 0x2074928, 0x5a9f230, 0xa030990, 0x57a3540, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0xa030990, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x517ab70, 0x2074928, 0x5a9f230, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x48cb200, 0x2080c30, 0x595ae00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 8206 [select, 5 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x5c35590, 0x1cc0a02, 0xf, 0x2052fe8, 0x5b862a0, 0x1, 0x0, 0x1828bc8, 0x5a4def0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5c35590, 0x20732c8, 0x5afdc60, 0x1cc0a02, 0xf, 0x2052fe8, 0x5b862a0, 0x1cb690f, 0xa, 0x56e41c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 8207 [select, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*state).run(0x5a6d490)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:220 +0x1c0
created by github.com/hashicorp/consul/agent/proxycfg.(*state).Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:106 +0xbc

goroutine 8208 [chan receive, 5 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked.func1(0x5b9cc00, 0x56e4200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:195 +0x68
created by github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:193 +0x164

goroutine 8965 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6ec80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiNodeInfo(0x4d6ec80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:149 +0x20
testing.tRunner(0x4d6ec80, 0x1d9d34c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8348 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5d45a40, 0x54f82a8, 0x20732c8, 0x5becf60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 8234 [select, 5 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x56e43c0, 0x551b7b0, 0x20732c8, 0x56ea700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 11489 [select, 4 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x5ffe190, 0x1cc0714, 0xf, 0x2052fd0, 0x62a6e70, 0x1, 0x0, 0x1828b00, 0x62bb600, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5ffe190, 0x20732c8, 0x68906a0, 0x1cc0714, 0xf, 0x2052fd0, 0x62a6e70, 0x1cae1b2, 0x5, 0x62bb540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33959 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x9b1ec60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 8770 [select, 5 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x5096a10, 0x1d9d490, 0x5a8b8c0, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x5a8b8c0, 0x2080d50, 0x54f83b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x5a8b8c0, 0x2080d50, 0x54f83b0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 8343 [select, 5 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x5a8b8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 8956 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6e6e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_JSON(0x4d6e6e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:21 +0x20
testing.tRunner(0x4d6e6e0, 0x1d9d324)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8957 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6e780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_Size_Item(0x4d6e780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:40 +0x20
testing.tRunner(0x4d6e780, 0x1d9d328)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 27537 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x3da89c8a, 0x19)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xb, 0x1cc0705, 0xf, 0x5a7fe00, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x595ddc8, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33234 [select]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x70dd3f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 12856 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6dfe660, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a45d40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a45d40, 0x6903740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6943470, 0x5a45d40, 0x6903740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33248 [IO wait]:
internal/poll.runtime_pollWait(0xa5332650, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x89b6d84, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x89b6d70, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x89b6d70, 0x1, 0x2020501, 0x405)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x8da9200, 0x1, 0x9439900, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x8da9200, 0x2, 0x10000, 0x10000)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x83251c0, 0x8da9200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 28837 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x8cbd1c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x89be480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x89be480, 0x8eb9480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8c64d10, 0x89be480, 0x8eb9480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33211 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x86a9200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x86a9200, 0x4e25000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 20593 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x70ecc60, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x763ed80, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x763ed80, 0x708f7c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7d1f9d0, 0x763ed80, 0x708f7c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 21967 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x61bea20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x639c000, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x639c000, 0x6bbfdc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x53a7210, 0x639c000, 0x6bbfdc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 11644 [select, 4 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x62bb840, 0x595dd50, 0x20732c8, 0x68912c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33256 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x852f200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 29092 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x771c5a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x91086c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x91086c0, 0x8257900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8f70840, 0x91086c0, 0x8257900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8233 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x56ea700, 0x5e129f4, 0x20, 0x20, 0x1, 0x5cb04c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x56ea620, 0x20732c8, 0x56ea700, 0x551b7b0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x56ea620, 0x56e43c0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5a8b8c0, 0x5b86378, 0x5b5c00c, 0x5e12b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x523fe78, 0x5b86360, 0x5b5c000, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x523fec8, 0x13, 0x1cac698, 0x4, 0x5e12d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x523fec8, 0x13, 0x5c7954c, 0x3, 0x3, 0x42646c01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5c13d60, 0x5a1aa50, 0x5cf1d58, 0x0, 0x5902b40, 0x5c1b8e0, 0x1b47298, 0x5b86360, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5a1aa50, 0x2073568, 0x56ea5c0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5a8b8c0, 0x1cbfad5, 0xf, 0x1b47298, 0x5b862a0, 0x1828bc8, 0x5a4dfb0, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x5c66160, 0x1cbfad5, 0xf, 0x1b47298, 0x5b862a0, 0x1828bc8, 0x5a4dfb0, 0x5a4df84, 0x48654a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x551b260, 0x1, 0x0, 0xb2c97000, 0x8b, 0x56ea5a0, 0x2052fe8, 0x5b862a0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x551b260, 0x4da3220, 0x1828bc8, 0x5a4def0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 34045 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0xa166240, 0x32cb2c0, 0x0, 0x0, 0x1)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x918bd00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 17587 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6af6c00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x73b58c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x73b58c0, 0x6f039c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5c954e0, 0x73b58c0, 0x6f039c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 23139 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x62bc280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7518240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7518240, 0x7152200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6f4ade0, 0x7518240, 0x7152200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 34044 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0xa13bb50)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 31343 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x81374c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x75198c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x75198c0, 0x953c500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x64db790, 0x75198c0, 0x953c500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 26200 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7cf75e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x89bf680, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x89bf680, 0x802c300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7ef2130, 0x89bf680, 0x802c300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33911 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x9b24750)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 33446 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x903e690)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 33915 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x9b66e70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 31003 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xa74429a8, 0x13)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xa, 0x1cc0705, 0xf, 0x638c000, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x81c0c40, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33920 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x9b1e6c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 33928 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x9ebc000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x9ebc000, 0x9c988f8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 33240 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x852eb40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 32346 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x7d19fc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x89bf8c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x89bf8c0, 0x862af80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x9673770, 0x89bf8c0, 0x862af80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 31341 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x221144ef, 0x18)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xb, 0x1cc0705, 0xf, 0x5bd28a0, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x651c528, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33685 [select]:
github.com/hashicorp/yamux.(*Session).send(0x96ccd90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 33221 [IO wait]:
internal/poll.runtime_pollWait(0xa5332128, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x8fb6154, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x8fb6140, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x8fb6140, 0x1b5c4c0, 0x4ba0000, 0xb6d99300)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x8dbcb80, 0xb6, 0x18, 0x8ba16a0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x8dbcb80, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x8dbcb80, 0x20, 0x1b5c4c0, 0x310001, 0x8ba16a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x87e05a0, 0x2064b48, 0x8c3d688, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x8ae02c0, 0x8e8e900, 0x8a9a9a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 33237 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x70dd3f0, 0x2a05f200, 0x1, 0x8324e40, 0x90544c0, 0x4e25378)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 31830 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x9a66c00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x89bfd40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x89bfd40, 0x925d3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x88dc030, 0x89bfd40, 0x925d3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33447 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x903e690)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 33904 [select]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x99588f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 33966 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x99cd200, 0x0, 0x1d9d4d8, 0x9b1e6c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 33958 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x9b1ec60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 33238 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x70dd3f0, 0x90544c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:152 +0xe0
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 21540 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6d7cae0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x71d2240, 0x7397d0d, 0x1ac5810, 0x65ef980, 0x40, 0x1bcef80, 0x6a2a401, 0x65ef980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:370 +0x120
github.com/hashicorp/consul/agent/consul.(*consulCADelegate).ApplyCARequest(0x7a39300, 0x65ef980, 0x1, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/consul_ca_delegate.go:19 +0x38
github.com/hashicorp/consul/agent/connect/ca.(*ConsulProvider).Configure(0x65ef900, 0x6801561, 0x24, 0x6072d01, 0x725cae0, 0x6e5e0c0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/connect/ca/provider_consul.go:106 +0x648
github.com/hashicorp/consul/agent/consul.(*Server).initializeRootCA(0x71d2240, 0x20851d0, 0x65ef900, 0x7509770, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1005 +0x48
github.com/hashicorp/consul/agent/consul.(*Server).initializeCA(0x71d2240, 0x68abdc0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader_oss.go:24 +0x90
github.com/hashicorp/consul/agent/consul.(*Server).establishLeadership(0x71d2240, 0x2, 0x2)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:324 +0x134
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x71d2240, 0x71ef4c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:176 +0x624
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6ca0570, 0x71d2240, 0x71ef4c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 15351 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6ab3580, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x75186c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x75186c0, 0x6ed2980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6e8a7e0, 0x75186c0, 0x6ed2980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33244 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x852eb40, 0x1cad6ef, 0x5, 0x89efae0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 29472 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x9426fbb4, 0x15)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xb, 0x1cc0705, 0xf, 0x5c051c0, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x7aaae70, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29912 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x7e1cf60, 0x59a19f4, 0x20, 0x20, 0x1, 0x7c15fd0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x7e1ce80, 0x20732c8, 0x7e1cf60, 0x7a38e68, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x7e1ce80, 0x68b4e40, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5ac5200, 0x7958a98, 0x73b732c, 0x59a1b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x71c5db8, 0x7958a80, 0x73b7320, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x71c5e08, 0x13, 0x1cac698, 0x4, 0x59a1d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x71c5e08, 0x13, 0x5d50d4c, 0x3, 0x3, 0x869c9601, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6b72720, 0x6081380, 0x6c21e18, 0x0, 0x7e4f5e0, 0x7397740, 0x1b47298, 0x7958a80, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x6081380, 0x2073568, 0x7e1ce20, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5ac5200, 0x1cbfad5, 0xf, 0x1b47298, 0x767e180, 0x1828bc8, 0x73b72f0, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7500c60, 0x1cbfad5, 0xf, 0x1b47298, 0x767e180, 0x1828bc8, 0x73b72f0, 0x73b7294, 0x4ba0000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x7a386a0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x7e1ce00, 0x2052fe8, 0x767e180, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x7a386a0, 0x70c5160, 0x1828bc8, 0x5d54c30, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33961 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x9b1ec60, 0x1cad4b5, 0x5, 0x9ed1ce0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 33620 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x95b7800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x95b7800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x95b7800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x95b7800, 0x99c04a8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 33216 [IO wait]:
internal/poll.runtime_pollWait(0xa53322b4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x89b6bf4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x89b6be0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x89b6be0, 0x3, 0x3, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x8da8a00, 0x5b5f774, 0x4bed334, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x8da8a00, 0x12b94, 0x1ba48, 0x91dbd70)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x8324cc0, 0x8da8a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 29296 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x610c3c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7762240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7762240, 0x7ea4c80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x78a5660, 0x7762240, 0x7ea4c80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33919 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x9b66e70, 0x1dcd6500, 0x0, 0x9e8db80, 0x9c190c0, 0x9ae1fe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 19614 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7ad0de0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x736f200, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x736f200, 0x723e680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6d7afd0, 0x736f200, 0x723e680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33239 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x70dd3f0, 0x1dcd6500, 0x0, 0x8325000, 0x90544c0, 0x4e25388)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 30680 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x86257c0, 0x1cc0714, 0xf, 0x2052fd0, 0x8dda7e0, 0x1, 0x0, 0x1828b00, 0x7ac7100, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x86257c0, 0x20732c8, 0x837c420, 0x1cc0714, 0xf, 0x2052fd0, 0x8dda7e0, 0x1cae1b2, 0x5, 0x7b23b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33656 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x974f640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 33214 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x832ed80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 33265 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x7814240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:67 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 33213 [select]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x89efa00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 28908 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x8622fa0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x89f9680, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x89f9680, 0x8fe5400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8c65950, 0x89f9680, 0x8fe5400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29391 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5b69180, 0x637da0c, 0x20, 0x20, 0x6abdbf0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5b690c0, 0x20732c8, 0x5b69180, 0x68bd4e0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5b690c0, 0x62d3c40, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x73b4000, 0x5ff64fc, 0x62d3c1c, 0x637db70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x764eb90, 0x5ff64d0, 0x62d3c00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x6ffaf50, 0x13, 0x1cac698, 0x4, 0x637dd4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x6ffaf50, 0x13, 0x5fbcd4c, 0x3, 0x3, 0xb1801501, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6a5ba20, 0x6def770, 0x5c15b88, 0x0, 0x764ec80, 0x514c0a0, 0x1b9a5b0, 0x5ff64d0, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x6def770, 0x2073568, 0x5b68ec0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x73b4000, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x5a6d180, 0x1828b00, 0x62d3bc0, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48c9340, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x5a6d180, 0x1828b00, 0x62d3bc0, 0x65c5114, 0x48f61e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x60cb9e0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x5b68ea0, 0x2052fd0, 0x5a6d180, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x60cb9e0, 0x5a710c0, 0x1828b00, 0x594fcc0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33927 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x9ebc000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x9ebc000, 0x9c988f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 33510 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x7814240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1157 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1152 +0xac

goroutine 31236 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x854d07e0, 0x13)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0x9, 0x1cc0705, 0xf, 0x5d7eac0, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x6258df8, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29464 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x8399180, 0x1cc0a02, 0xf, 0x2052fe8, 0x61ccea0, 0x1, 0x0, 0x1828bc8, 0x5b5d1a0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x8399180, 0x20732c8, 0x5aead80, 0x1cc0a02, 0xf, 0x2052fe8, 0x61ccea0, 0x1cb690f, 0xa, 0x58b9740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33909 [select]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x9ec2ce0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 29569 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x68a83a0, 0x59ae9f4, 0x20, 0x20, 0x1, 0x52d85d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x68a82a0, 0x20732c8, 0x68a83a0, 0x4bc99c0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x68a82a0, 0x5b065c0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5c478c0, 0x5df2678, 0x6835aac, 0x59aeb7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x60bb9a0, 0x5df2660, 0x6835aa0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x60bb9f8, 0x13, 0x1cac698, 0x4, 0x59aed4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x60bb9f8, 0x13, 0x5d8b54c, 0x3, 0x3, 0xf075a701, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6681220, 0x50f9e00, 0x4e38598, 0x0, 0x8cddae0, 0x60da000, 0x1b47298, 0x5df2660, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x50f9e00, 0x2073568, 0x68a81c0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5c478c0, 0x1cbfad5, 0xf, 0x1b47298, 0x61cd080, 0x1828bc8, 0x6835a70, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x5c67e40, 0x1cbfad5, 0xf, 0x1b47298, 0x61cd080, 0x1828bc8, 0x6835a70, 0x6835a44, 0x48f6000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x5e2fe70, 0x1, 0x0, 0xb2c97000, 0x8b, 0x68a8180, 0x2052fe8, 0x61cd080, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x5e2fe70, 0x628c420, 0x1828bc8, 0x6506690, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 31214 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xe71cc153, 0x16)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xb, 0x1cc0705, 0xf, 0x4bcb860, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x549d290, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 34030 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0xa07cf60, 0x73bd9f4, 0x20, 0x20, 0x1, 0x9ff7ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0xa07ce80, 0x20732c8, 0xa07cf60, 0xa0274a0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0xa07ce80, 0xa022f40, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x8d85680, 0x9f8e498, 0xa03541c, 0x73bdb7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x8f33c10, 0x9f8e480, 0xa035410, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x8f33c60, 0x13, 0x1cac698, 0x4, 0x73bdd4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x8f33c60, 0x13, 0x542dd4c, 0x3, 0x3, 0x934fe201, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x95fa4c0, 0x90203c0, 0xa084908, 0x0, 0x96040a0, 0x9746d20, 0x1b47298, 0x9f8e480, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x90203c0, 0x2073568, 0xa07cce0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x8d85680, 0x1cbfad5, 0xf, 0x1b47298, 0x9f8e3c0, 0x1828bc8, 0xa0353e0, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x8e8a000, 0x1cbfad5, 0xf, 0x1b47298, 0x9f8e3c0, 0x1828bc8, 0xa0353e0, 0xa0353b4, 0x48643c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x99c0d40, 0x1, 0x0, 0xb2c97000, 0x8b, 0xa07ccc0, 0x2052fe8, 0x9f8e3c0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x99c0d40, 0x9204de0, 0x1828bc8, 0xa035320, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33926 [runnable]:
syscall.Syscall(0x94, 0x7f, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x7f, 0x1000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x99cab40, 0x1000, 0x1000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*Tx).writeMeta(0x8777280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/boltdb/bolt/tx.go:556 +0xfc
github.com/boltdb/bolt.(*Tx).Commit(0x8777280, 0x9fccd60, 0x8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/boltdb/bolt/tx.go:221 +0x3e8
github.com/hashicorp/raft-boltdb.(*BoltStore).StoreLogs(0x99983b0, 0x9f06f38, 0x2, 0x2, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:187 +0x228
github.com/hashicorp/raft.(*LogCache).StoreLogs(0x9b216b0, 0x9f06f38, 0x2, 0x2, 0x32b9570, 0x17480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/log_cache.go:61 +0x110
github.com/hashicorp/raft.(*Raft).dispatchLogs(0x9ebc000, 0x9f06f30, 0x2, 0x2)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:1061 +0x284
github.com/hashicorp/raft.(*Raft).leaderLoop(0x9ebc000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:746 +0x5ac
github.com/hashicorp/raft.(*Raft).runLeader(0x9ebc000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x9ebc000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x9ebc000, 0x9c988e8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 33690 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x96ccee0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 22869 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x75ab680, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x71d2fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x71d2fc0, 0x7780800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x796eb70, 0x71d2fc0, 0x7780800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33255 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x70dd4a0, 0xbebc200, 0x0, 0x8325500, 0x90549c0, 0x4e256b8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 18935 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6e01b40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x52d3ab0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x52d3ab0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 17850 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5b95900, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5c46480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5c46480, 0x66cb780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5ce55f0, 0x5c46480, 0x66cb780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 30678 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x72022e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8d84240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8d84240, 0x895d3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x91261e0, 0x8d84240, 0x895d3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33251 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x70dd4a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 30687 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x7b23bc0, 0x8dcef28, 0x20732c8, 0x837cca0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 18437 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x607f440, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x71d3d40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x71d3d40, 0x6cc6900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x782acf0, 0x71d3d40, 0x6cc6900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 22454 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x79b0cc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x73b5b00, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x73b5b00, 0x7972240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6e8ac70, 0x73b5b00, 0x7972240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29567 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x684fe20, 0x6382a0c, 0x20, 0x20, 0x52d83b0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x684fd80, 0x20732c8, 0x684fe20, 0x4bc9848, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x684fd80, 0x5b06080, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5c478c0, 0x65a84fc, 0x5b0601c, 0x6382b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x8cdd6d0, 0x65a84d0, 0x5b06000, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x60bb910, 0x13, 0x1cac698, 0x4, 0x6382d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x60bb910, 0x13, 0x5fc054c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x66810a0, 0x50f9e00, 0x4ad8310, 0x0, 0x8cdd7c0, 0x60a9040, 0x1b9a5b0, 0x65a84d0, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x50f9e00, 0x2073568, 0x684fca0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5c478c0, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x5a6d9d0, 0x1828b00, 0x5b55fc0, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x5c67e40, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x5a6d9d0, 0x1828b00, 0x5b55fc0, 0x6835864, 0x4bb34a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x5e2fe68, 0x1, 0x0, 0xb2c97000, 0x8b, 0x684fc80, 0x2052fd0, 0x5a6d9d0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x5e2fe68, 0x628c3e0, 0x1828b00, 0x5ab5b80, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29913 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x68b4e40, 0x7a38e68, 0x20732c8, 0x7e1cf60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33257 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x852f200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 21331 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x73a8cc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x639c900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x639c900, 0x75811c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6b32710, 0x639c900, 0x75811c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33268 [chan receive]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x5ebd140, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x8ae02c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 20959 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6209e20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5c46240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5c46240, 0x5c64400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5c15d90, 0x5c46240, 0x5c64400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29403 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x4f6d7c0, 0x6259888, 0x20732c8, 0x5b99c40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 30932 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x72024e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4bf4d80, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4bf4d80, 0x7e9dbc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x79815d0, 0x4bf4d80, 0x7e9dbc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33954 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x99588f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 33903 [IO wait]:
internal/poll.runtime_pollWait(0xa5331d8c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9c1b5a4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x9c1b590, 0x9f3a000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x9c1b590, 0x9f3a000, 0x10000, 0x10000, 0x0, 0x32bc201, 0x1, 0x0, 0x1)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x9f06168, 0x9f3a000, 0x10000, 0x10000, 0x5fbef34, 0x101, 0x5fbef08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x9f06168, 0x9f3a000, 0x10000, 0x10000, 0x2, 0x1, 0x0, 0x6eaddc, 0x7a23b00)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x9f08340, 0x9f06168)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 29163 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x7d88fa0, 0x1cc0a02, 0xf, 0x2052fe8, 0x7c7f3e0, 0x1, 0x0, 0x1828bc8, 0x6deeab0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x7d88fa0, 0x20732c8, 0x75ab080, 0x1cc0a02, 0xf, 0x2052fe8, 0x7c7f3e0, 0x1cb690f, 0xa, 0x6ecb580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33273 [IO wait]:
internal/poll.runtime_pollWait(0xa5331e10, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x89b6f14, 0x72, 0xff00, 0xffff, 0x8170c30)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x89b6f00, 0x9edc000, 0xffff, 0xffff, 0x8170c30, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x89b6f00, 0x9edc000, 0xffff, 0xffff, 0x8170c30, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x4e25768, 0x9edc000, 0xffff, 0xffff, 0x8170c30, 0x28, 0x28, 0xb6d99008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x4e25768, 0x9edc000, 0xffff, 0xffff, 0x8170c30, 0x28, 0x28, 0xb6d99008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x4e25768, 0x9edc000, 0xffff, 0xffff, 0x62, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x6ad3280, 0x4e25768, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x4e257a0, 0x4e25768, 0x77359400, 0x0, 0x9ed42a0, 0x1, 0x0, 0x0, 0x2054770, 0x9ed42a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x6ad3280, 0x4e25768, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x6ad3280, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5ebd1a0, 0x1cac30d, 0x3, 0x79817b0, 0xf, 0x8da9c00, 0x1, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x8ae02c0, 0x5ebd1a0, 0x8325680, 0x83256c0, 0x205db48, 0x7ae1060)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 23259 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5bf53a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a50240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a50240, 0x731efc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5af6ca0, 0x5a50240, 0x731efc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 23142 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4fa79c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6f12fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6f12fc0, 0x71527c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5c95e80, 0x6f12fc0, 0x71527c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 26694 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x67aa120, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7ddb440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7ddb440, 0x79aff80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x70644e0, 0x7ddb440, 0x79aff80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 31335 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x7d33880, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a50900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a50900, 0x854fd40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4df9e60, 0x5a50900, 0x854fd40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 20319 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7cdada0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7273b00, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7273b00, 0x6f7dac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x796f310, 0x7273b00, 0x6f7dac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 30001 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x82bd040, 0x79399f4, 0x20, 0x20, 0x1, 0x82b61d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x82bcf60, 0x20732c8, 0x82bd040, 0x80a7168, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x82bcf60, 0x65ee540, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x65e4240, 0x8706018, 0x72dc33c, 0x7939b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x601c628, 0x8706000, 0x72dc330, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x601c680, 0x13, 0x1cac698, 0x4, 0x7939d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x601c680, 0x13, 0x4950d4c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x7275dc0, 0x5f98300, 0x72d9c20, 0x0, 0x80dcbe0, 0x79db080, 0x1b47298, 0x8706000, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5f98300, 0x2073568, 0x82bcf00, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x65e4240, 0x1cbfad5, 0xf, 0x1b47298, 0x6bb1440, 0x1828bc8, 0x72dc300, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x83822c0, 0x1cbfad5, 0xf, 0x1b47298, 0x6bb1440, 0x1828bc8, 0x72dc300, 0x72dc2d4, 0x4864d20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x601d8b8, 0x1, 0x0, 0xb2c97000, 0x8b, 0x82bcee0, 0x2052fe8, 0x6bb1440, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x601d8b8, 0x7eb5300, 0x1828bc8, 0x6dc9ec0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33250 [select]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x70dd4a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 33212 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x86a9200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x86a9200, 0x4e25008)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 9900 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x53ce400, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x61c6480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x61c6480, 0x5a36c80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4df82b0, 0x61c6480, 0x5a36c80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29931 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x69cb300, 0x835aa68, 0x20732c8, 0x7e8d660)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 22453 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7babb80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6dec240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6dec240, 0x79721c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x70eee40, 0x6dec240, 0x79721c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 27733 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xb112e5bc, 0x17)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xb, 0x1cc0705, 0xf, 0x5cc4280, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x8ca60a0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 26397 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7990a60, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8dee240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8dee240, 0x802c880)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x70dbac0, 0x8dee240, 0x802c880)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33807 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x9b35d20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 27740 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x0, 0x89526c0, 0x20732c8, 0x8c02020)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 11646 [select, 4 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x6891be0, 0x61f29f4, 0x20, 0x20, 0x1, 0x6528a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6891a40, 0x20732c8, 0x6891be0, 0x595dfc8, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6891a40, 0x62bb880, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4e37200, 0x5df3518, 0x59f592c, 0x61f2b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x60cc380, 0x5df3500, 0x59f5920, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x60cc3d0, 0x13, 0x1cac698, 0x4, 0x61f2d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x60cc3d0, 0x13, 0x58c5d4c, 0x3, 0x3, 0x4d448e01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x60dec00, 0x5fda300, 0x5a2f0b8, 0x0, 0x63bdc20, 0x6891480, 0x1b47298, 0x5df3500, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5fda300, 0x2073568, 0x68918a0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4e37200, 0x1cbfad5, 0xf, 0x1b47298, 0x5df3440, 0x1828bc8, 0x59f58f0, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4e71760, 0x1cbfad5, 0xf, 0x1b47298, 0x5df3440, 0x1828bc8, 0x59f58f0, 0x59f58c4, 0x48645a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x60cd828, 0x1, 0x0, 0xb2c97000, 0x8b, 0x6891880, 0x2052fe8, 0x5df3440, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x60cd828, 0x5cd5b80, 0x1828bc8, 0x59f5830, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 16574 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x684ef80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6e04fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6e04fc0, 0x64de1c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6e63b40, 0x6e04fc0, 0x64de1c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33938 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x9b1e6c0, 0x1cae9dd, 0x6, 0x9ec2d80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 13072 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5bdf960, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6e046c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6e046c0, 0x6ab4900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x54187c0, 0x6e046c0, 0x6ab4900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29646 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x8569630, 0x1cc0714, 0xf, 0x2052fd0, 0x81e8d90, 0x1, 0x0, 0x1828b00, 0x5c61dc0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x8569630, 0x20732c8, 0x79e4700, 0x1cc0714, 0xf, 0x2052fd0, 0x81e8d90, 0x1cae1b2, 0x5, 0x5c61d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 20969 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5b094c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7762900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7762900, 0x64a0580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x74ef5a0, 0x7762900, 0x64a0580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 28323 [select, 1 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x82f3ef0, 0x1cc0714, 0xf, 0x2052fd0, 0x53b8690, 0x1, 0x0, 0x1828b00, 0x54daa40, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x82f3ef0, 0x20732c8, 0x65b4280, 0x1cc0714, 0xf, 0x2052fd0, 0x53b8690, 0x1cae1b2, 0x5, 0x54da680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33140 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x721a240, 0x5b9a7e0, 0x0, 0x9a128c0, 0x9a128c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x6ea8b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 32881 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x439536ab, 0x8)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0x8, 0x1cc0705, 0xf, 0x8f6bb40, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x85a02e0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33960 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x9b1ec60, 0x1cae9dd, 0x6, 0x9ed1cc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 29284 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x6f95760, 0x7ae79f4, 0x20, 0x20, 0x1, 0x6a583c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6f94fc0, 0x20732c8, 0x6f95760, 0x6e60b10, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6f94fc0, 0x6f0fbc0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x7ddab40, 0x5b865b8, 0x6f1f95c, 0x7ae7b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x7554560, 0x5b865a0, 0x6f1f950, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x75545b0, 0x13, 0x1cac698, 0x4, 0x7ae7d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x75545b0, 0x13, 0x5d51d4c, 0x3, 0x3, 0x37091d01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x7ea8ba0, 0x7b07740, 0x74eef78, 0x0, 0x76d6000, 0x74c4900, 0x1b47298, 0x5b865a0, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x7b07740, 0x2073568, 0x6f94f60, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x7ddab40, 0x1cbfad5, 0xf, 0x1b47298, 0x7c7f3e0, 0x1828bc8, 0x6f1f920, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7eb9760, 0x1cbfad5, 0xf, 0x1b47298, 0x7c7f3e0, 0x1828bc8, 0x6f1f920, 0x6f1f8f4, 0x48f61e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x6ffa4d0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x6f94f20, 0x2052fe8, 0x7c7f3e0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x6ffa4d0, 0x6464480, 0x1828bc8, 0x6deeab0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 26500 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x74c79e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x71d2b40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x71d2b40, 0x68b1bc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7767440, 0x71d2b40, 0x68b1bc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33806 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x99927d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 15157 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5e29de0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5ac4d80, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5ac4d80, 0x5a80940)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x65f3020, 0x5ac4d80, 0x5a80940)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 17704 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5b0d7e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x736efc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x736efc0, 0x66fe9c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x64db420, 0x736efc0, 0x66fe9c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 20248 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6c16740, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x62fefc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x62fefc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 28330 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x54dac00, 0x5860ed8, 0x20732c8, 0x65b4860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33914 [select]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x9b66e70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 33348 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7814240, 0x8e8eac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:221 +0x274
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7567bd0, 0x7814240, 0x8e8eac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33242 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x852eb40, 0x1cae9dd, 0x6, 0x89efaa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 29648 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x8569630, 0x1cc0a02, 0xf, 0x2052fe8, 0x6bb1440, 0x1, 0x0, 0x1828bc8, 0x6dc9ec0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x8569630, 0x20732c8, 0x79e4700, 0x1cc0a02, 0xf, 0x2052fe8, 0x6bb1440, 0x1cb690f, 0xa, 0x5c61d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 29386 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x62d2fc0, 0x68bd310, 0x20732c8, 0x4cc2400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 28686 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x4e7d0f01, 0x15)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xb, 0x1cc0705, 0xf, 0x5bd2620, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x651c4b0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29285 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x6f0fbc0, 0x6e60b10, 0x20732c8, 0x6f95760)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 29568 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5b06080, 0x4bc9848, 0x20732c8, 0x684fe20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 19569 [semacquire, 2 minutes]:
sync.runtime_Semacquire(0x6aad4ec)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x6aad4ec)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x6aad490)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x61c6900)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:362 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x61c6900, 0x7983640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:180 +0x694
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5a2f790, 0x61c6900, 0x7983640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 28702 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x84950a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x82c8900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x82c8900, 0x8d3c800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x78a46a0, 0x82c8900, 0x8d3c800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 26013 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x8b5ece0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x71d26c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x71d26c0, 0x845e940)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x713e9f0, 0x71d26c0, 0x845e940)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29402 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5b99c40, 0x59afa0c, 0x20, 0x20, 0x6db8be0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5b99b00, 0x20732c8, 0x5b99c40, 0x6259888, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5b99b00, 0x4f6d7c0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x73b4000, 0x63c79fc, 0x4f6d79c, 0x59afb70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x764eb90, 0x63c79d0, 0x4f6d780, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x6ffaf50, 0x13, 0x1cac698, 0x4, 0x59afd4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x6ffaf50, 0x13, 0x526254c, 0x3, 0x3, 0xb7235601, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6a5ba20, 0x6def770, 0x5b1ab58, 0x0, 0x764ec80, 0x514c2a0, 0x1b9a5b0, 0x63c79d0, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x6def770, 0x2073568, 0x5b99a20, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x73b4000, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x53b8620, 0x1828b00, 0x4f6d700, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48c9340, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x53b8620, 0x1828b00, 0x4f6d700, 0x68577a4, 0x48f65a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x60cb9e0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x5b99a00, 0x2052fd0, 0x53b8620, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x60cb9e0, 0x5a710c0, 0x1828b00, 0x594fdc0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 14804 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x63789c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7518fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7518fc0, 0x6e4e980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6fdc540, 0x7518fc0, 0x6e4e980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 21334 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7570600, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e5440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e5440, 0x7581400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x53a7530, 0x65e5440, 0x7581400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33940 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x9b1e6c0, 0x1cad6ef, 0x5, 0x9ec2dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 16687 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6273360, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6e058c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6e058c0, 0x6ed3600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5bcc6e0, 0x6e058c0, 0x6ed3600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 32731 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x6890d00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x73b4b40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x73b4b40, 0x7e9d8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6e40240, 0x73b4b40, 0x7e9d8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33247 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x832f680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 33217 [IO wait]:
internal/poll.runtime_pollWait(0xa497c040, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x89b6c44, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x89b6c30, 0x91de000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x89b6c30, 0x91de000, 0x10000, 0x10000, 0x72200, 0x7e4e001, 0x6b8cc01, 0x0, 0x29)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4e25080, 0x91de000, 0x10000, 0x10000, 0x497af34, 0x101, 0x497af08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4e25080, 0x91de000, 0x10000, 0x10000, 0x0, 0x1, 0x0, 0x101, 0x920dfc0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x8324cc0, 0x4e25080)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 29385 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x4cc2400, 0x87e39f4, 0x20, 0x20, 0x1, 0x6abd920)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x4cc2280, 0x20732c8, 0x4cc2400, 0x68bd310, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x4cc2280, 0x62d2fc0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x73b4000, 0x680f818, 0x65c45dc, 0x87e3b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x6ffaff8, 0x680f800, 0x65c45d0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x6ffb048, 0x13, 0x1cac698, 0x4, 0x87e3d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x6ffb048, 0x13, 0x5d4cd4c, 0x3, 0x3, 0xabe16e01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6a5bbc0, 0x6def770, 0x5c151c8, 0x0, 0x764f220, 0x5a63020, 0x1b47298, 0x680f800, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x6def770, 0x2073568, 0x4b23f80, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x73b4000, 0x1cbfad5, 0xf, 0x1b47298, 0x61ccea0, 0x1828bc8, 0x65c45a0, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48c9340, 0x1cbfad5, 0xf, 0x1b47298, 0x61ccea0, 0x1828bc8, 0x65c45a0, 0x65c4574, 0x48645a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x60cb9e8, 0x1, 0x0, 0xb2c97000, 0x8b, 0x4b23f20, 0x2052fe8, 0x61ccea0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x60cb9e8, 0x5a71100, 0x1828bc8, 0x5b5d1a0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 17970 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x60e7ee0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a8b440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a8b440, 0x6bb8600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4ec79f0, 0x5a8b440, 0x6bb8600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33523 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x9020420, 0x20001, 0x606330, 0x7bd6200, 0x5746714)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x70baf80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 22858 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7168820, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x82c9680, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x82c9680, 0x6b04180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x796f2d0, 0x82c9680, 0x6b04180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 17568 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x67caae0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6ded8c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6ded8c0, 0x6a397c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x64da090, 0x6ded8c0, 0x6a397c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 30671 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x822e000, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x72738c0, 0x59add09, 0x1b8a698, 0x8707f80, 0x1, 0x0, 0x8457fc0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:370 +0x120
github.com/hashicorp/consul/agent/consul.(*Server).getOrCreateAutopilotConfig(0x72738c0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:921 +0x154
github.com/hashicorp/consul/agent/consul.(*Server).establishLeadership(0x72738c0, 0x2, 0x2)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:320 +0x118
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x72738c0, 0x91db700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:176 +0x624
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x9034720, 0x72738c0, 0x91db700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33900 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x9b2fd40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 10839 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5afcbe0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6843440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6843440, 0x68997c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x554c800, 0x6843440, 0x68997c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33249 [IO wait]:
internal/poll.runtime_pollWait(0xa5332548, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x89b6dd4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x89b6dc0, 0x920e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x89b6dc0, 0x920e000, 0x10000, 0x10000, 0x1924000, 0x1, 0x1, 0x0, 0x9c63600)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4e253e8, 0x920e000, 0x10000, 0x10000, 0x494ef34, 0x101, 0x494ef08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4e253e8, 0x920e000, 0x10000, 0x10000, 0x2020501, 0x405, 0x7229c, 0x494efa4, 0x494ef80)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x83251c0, 0x4e253e8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 29837 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x82b20c0, 0x57fba0c, 0x20, 0x20, 0x7ed6030, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x82b2020, 0x20732c8, 0x82b20c0, 0x601dc48, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x82b2020, 0x69c3000, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5ac5200, 0x816efec, 0x69c2fdc, 0x57fbb70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x7e4f1d0, 0x816efc0, 0x69c2fc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x71c5d28, 0x13, 0x1cac698, 0x4, 0x57fbd4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x71c5d28, 0x13, 0x5fc5d4c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6b725a0, 0x6081380, 0x75662c0, 0x0, 0x7e4f2c0, 0x7397540, 0x1b9a5b0, 0x816efc0, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x6081380, 0x2073568, 0x7a9bf60, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5ac5200, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x839bf10, 0x1828b00, 0x69c2f80, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7500c60, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x839bf10, 0x1828b00, 0x69c2f80, 0x7412844, 0x48f63c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x7a38698, 0x1, 0x0, 0xb2c97000, 0x8b, 0x7a9bf40, 0x2052fd0, 0x839bf10, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x7a38698, 0x70c5120, 0x1828b00, 0x63bfec0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33905 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x99588f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 15332 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6a57840, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a8afc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a8afc0, 0x75a6300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6e8a110, 0x5a8afc0, 0x75a6300)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29863 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x7ac9fc0, 0x6c99a0c, 0x20, 0x20, 0x834dea0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x7ac9f20, 0x20732c8, 0x7ac9fc0, 0x80a7008, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x7ac9f20, 0x6a2b240, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x65e4240, 0x887802c, 0x6a2b1dc, 0x6c99b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x80dc7d0, 0x8878000, 0x6a2b1c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x601c598, 0x13, 0x1cac698, 0x4, 0x6c99d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x601c598, 0x13, 0x4e87d4c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x7274900, 0x5f98300, 0x72d9930, 0x0, 0x80dc8c0, 0x7a72280, 0x1b9a5b0, 0x8878000, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5f98300, 0x2073568, 0x7ac9e60, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x65e4240, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x8937500, 0x1828b00, 0x6a2b180, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x83822c0, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x8937500, 0x1828b00, 0x6a2b180, 0x6bd35f4, 0x48643c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x601d8b0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x7ac9e40, 0x2052fd0, 0x8937500, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x601d8b0, 0x7eb52c0, 0x1828b00, 0x681ea40, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 11074 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x69c4560, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x523b200, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523b200, 0x5957dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4dc84e0, 0x523b200, 0x5957dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33537 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).retrySyncFullEventFn(0x94b0aa0, 0x0, 0x37)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:231 +0xd8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x94b0aa0, 0x1cbc90d, 0xd, 0x1cbc90d, 0xd)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:176 +0x60
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x94b0aa0, 0x1cb2cd8, 0x8, 0x76f1fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x94b0aa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 33245 [select]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x8af3c20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 12220 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x66a0b20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a8ad80, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a8ad80, 0x6592b40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5bfe840, 0x5a8ad80, 0x6592b40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29618 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5b065c0, 0x4bc99c0, 0x20732c8, 0x68a83a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33241 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x852eb40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 23917 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xe9a86a18, 0x19)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xa, 0x1cc0705, 0xf, 0x5bd2540, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x490a948, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29408 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x184faa2c, 0xe)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xb, 0x1cc0705, 0xf, 0x5bd2740, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x4e67538, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29572 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x8c22050, 0x1cc0a02, 0xf, 0x2052fe8, 0x61cd080, 0x1, 0x0, 0x1828bc8, 0x6506690, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x8c22050, 0x20732c8, 0x60a8ae0, 0x1cc0a02, 0xf, 0x2052fe8, 0x61cd080, 0x1cb690f, 0xa, 0x5ab5a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 22058 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6b23b20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x61c6000, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x61c6000, 0x6cc2d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x70110f0, 0x61c6000, 0x6cc2d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33971 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x7501e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 19406 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6e81020, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x523bb00, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523bb00, 0x79b7ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4b661f0, 0x523bb00, 0x79b7ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 21553 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5a7df40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x7d8db90, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x7d8db90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 29159 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x69ef1edd, 0x19)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xa, 0x1cc0705, 0xf, 0x510a180, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x8aff118, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 21216 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6d53b60, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x523afc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523afc0, 0x7532240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6b322f0, 0x523afc0, 0x7532240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 20053 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4c54540, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e5d40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e5d40, 0x6821840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6d7b120, 0x65e5d40, 0x6821840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 30682 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x86257c0, 0x1cc0a02, 0xf, 0x2052fe8, 0x822e300, 0x1, 0x0, 0x1828bc8, 0x5ebc000, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x86257c0, 0x20732c8, 0x837c420, 0x1cc0a02, 0xf, 0x2052fe8, 0x822e300, 0x1cb690f, 0xa, 0x7b23b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 11419 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6891d80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4e37200, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4e37200, 0x59df540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5d2a140, 0x4e37200, 0x59df540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33138 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x88ced20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 29838 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x69c3000, 0x601dc48, 0x20732c8, 0x82b20c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33957 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x99588f0, 0xbebc200, 0x0, 0x9f08500, 0x9ed2c40, 0x9f06420)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 34027 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x7a8f5cf2, 0x0)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0x4, 0x1cc0705, 0xf, 0x9f8db80, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0xa0261d8, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33956 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x99588f0, 0x9ed2c40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:152 +0xe0
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 30707 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x7ac7200, 0x8b395a8, 0x20732c8, 0x8437aa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33968 [IO wait]:
internal/poll.runtime_pollWait(0xa5332b78, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9c1a834, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x9c1a820, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x9c1a820, 0x8e00000, 0x4b03744, 0x4d198c)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x9b35d30, 0x0, 0x32c99b8, 0x1827b38)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x9b35d30, 0x1, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x99cd200, 0x206b9a8, 0x9b35d30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 32220 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x2b534281, 0x16)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xc, 0x1cc0705, 0xf, 0x5884160, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x7a1b3f8, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33917 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x9b66e70, 0x2a05f200, 0x1, 0x9e8db40, 0x9c190c0, 0x9ae1fd0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:128 +0xc8
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 33621 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x95b7800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x95b7800, 0x99c04b0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 13428 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6df03c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4e36b40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4e36b40, 0x6597ec0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5c942b0, 0x4e36b40, 0x6597ec0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33955 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x99588f0, 0x3b9aca00, 0x0, 0x9f084c0, 0x9ed2c40, 0x9f06410)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 33246 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x832f680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 31950 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x986e1e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x99dc900, 0x98a0400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:168 +0x4a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8d620c0, 0x99dc900, 0x98a0400)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33264 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x7814240, 0x0, 0x1d9d4d8, 0x852eb40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:497 +0xcfc

goroutine 33939 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x9b1e6c0, 0x1cad4b5, 0x5, 0x9ec2da0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 28569 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4927a40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x77626c0, 0x0, 0x1b0f3b0, 0x7dc4960, 0x0, 0x0, 0x0, 0x1849788)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:370 +0x120
github.com/hashicorp/consul/agent/consul.(*Catalog).Register(0x84e6b40, 0x7dc4960, 0x32c99b8, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go:142 +0x2e0
reflect.Value.call(0x4aab200, 0x84e6ba0, 0x13, 0x1cac698, 0x4, 0x6c9dd80, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab200, 0x84e6ba0, 0x13, 0x6c9dd80, 0x3, 0x3, 0x85b4e201, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5985440, 0x4d71830, 0x6f4a488, 0x0, 0x7e88dc0, 0x6df0720, 0x1b0f3b0, 0x7dc4960, 0x16, 0x1849788, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x4d71830, 0x2073568, 0x6a3e680, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x77626c0, 0x1cc1949, 0x10, 0x1b0f3b0, 0x7dc4910, 0x1849788, 0x32c99b8, 0x12b94, 0x1b874)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent/local.(*State).syncNodeInfo(0x7e81dd0, 0x5a526c0, 0x6c9def4)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/local/state.go:1478 +0x118
github.com/hashicorp/consul/agent/local.(*State).SyncChanges(0x7e81dd0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/local/state.go:1275 +0x3a0
github.com/hashicorp/consul/agent/local.(*State).SyncFull(0x7e81dd0, 0x1cb2c00, 0x8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/local/state.go:1218 +0x44
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x7d89360, 0x1cb2cd8, 0x8, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:167 +0x3c4
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x7d89360, 0x1cb2cd8, 0x8, 0x6c9dfdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x7d89360)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 34031 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0xa022f40, 0xa0274a0, 0x20732c8, 0xa07cf60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33271 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x8ae02c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 33916 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x9b66e70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 33243 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x852eb40, 0x1cad4b5, 0x5, 0x89efac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 26198 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x8719b20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8541440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8541440, 0x802c000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8ba3490, 0x8541440, 0x802c000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33689 [select]:
github.com/hashicorp/yamux.(*Session).send(0x96ccee0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 28355 [select]:
github.com/hashicorp/consul/api/watch.(*Plan).RunWithClientAndLogger(0x66a8150, 0x6736ee0, 0x68b7fb0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/api/watch/plan.go:89 +0x498
github.com/hashicorp/consul/api/watch.(*Plan).RunWithConfig(0x66a8150, 0x5884120, 0x17, 0x66a8bd0, 0xf, 0x5884120)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/api/watch/plan.go:49 +0x1a8
github.com/hashicorp/consul/agent.(*Agent).reloadWatches.func1(0x708d340, 0x66a8bd0, 0x66a8150)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1057 +0x11c
created by github.com/hashicorp/consul/agent.(*Agent).reloadWatches
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1041 +0xc00

goroutine 31474 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xfdb5d79a, 0x19)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xb, 0x1cc0705, 0xf, 0x638d3a0, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x549cd58, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29462 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x8399180, 0x1cc0714, 0xf, 0x2052fd0, 0x5a6d180, 0x1, 0x0, 0x1828b00, 0x594fcc0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x8399180, 0x20732c8, 0x5aead80, 0x1cc0714, 0xf, 0x2052fd0, 0x5a6d180, 0x1cae1b2, 0x5, 0x58b9740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 27632 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x91e07455, 0x19)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xa, 0x1cc0705, 0xf, 0x54c8ae0, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x64aa620, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 14089 [select, 3 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5516440, 0x651c0f8, 0x20732c8, 0x51ec600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 29161 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x7d88fa0, 0x1cc0714, 0xf, 0x2052fd0, 0x821ff10, 0x1, 0x0, 0x1828b00, 0x6ecb680, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x7d88fa0, 0x20732c8, 0x75ab080, 0x1cc0714, 0xf, 0x2052fd0, 0x821ff10, 0x1cae1b2, 0x5, 0x6ecb580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 34043 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x9ea7310)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:711 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 28307 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x62bd440, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x639d200, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x639d200, 0x7972f80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6242230, 0x639d200, 0x7972f80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33259 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x852f200, 0x1cad4b5, 0x5, 0x8af3ce0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 33963 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x99cd200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 20152 [semacquire, 2 minutes]:
sync.runtime_Semacquire(0x62ff01c)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x62ff01c)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x62fefc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x5ac46c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:362 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5ac46c0, 0x64de7c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:180 +0x694
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6fdd6f0, 0x5ac46c0, 0x64de7c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 27875 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7a72140, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8c526c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8c526c0, 0x8b42a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8d63600, 0x8c526c0, 0x8b42a40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 12920 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x68a86a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6843d40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6843d40, 0x6903000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6a55410, 0x6843d40, 0x6903000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33918 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x9b66e70, 0x9c190c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:152 +0xe0
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 21552 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7ca2c80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*AutopilotDelegate).PromoteNonVoters(0x7d64f10, 0x6d7c728, 0x0, 0x0, 0x0, 0x0, 0x0, 0x71efec0, 0x5fbf7b4, 0x6e5e340, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot.go:69 +0x40
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).promoteServers(0x7d8db90, 0x5fbf744, 0x3)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:140 +0x120
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x7d8db90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:112 +0x198
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 33266 [IO wait]:
internal/poll.runtime_pollWait(0xa5332968, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x8c9ad84, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x8c9ad70, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x8c9ad70, 0x347690, 0x32b5a1c, 0x48a03f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x7ea2ae0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x7ea2ae0, 0x2, 0x2, 0x3f800000, 0x8e85be0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x7814240, 0x206b9a8, 0x7ea2ae0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 33901 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x9b2fd40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 33253 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x70dd4a0, 0x3b9aca00, 0x0, 0x83254c0, 0x90549c0, 0x4e256a8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 33967 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x9f4d980, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x99cd200, 0x9f06d13, 0x1828358, 0x9f4fe80, 0x20, 0x2, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:370 +0x120
github.com/hashicorp/consul/agent/consul.(*Server).initializeACLs(0x99cd200, 0x1, 0x9fc9201, 0x1)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:520 +0xd48
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x99cd200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:106 +0x45c
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:512 +0xae0

goroutine 33236 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x70dd3f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 13071 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x63c5d60, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6e05d40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6e05d40, 0x6ab4700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5418500, 0x6e05d40, 0x6ab4700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 11132 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x65b9080, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6843200, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6843200, 0x6821280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x67a6600, 0x6843200, 0x6821280)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 13102 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5a53260, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e5680, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e5680, 0x6ab5e80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x53a6a60, 0x65e5680, 0x6ab5e80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33235 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x70dd3f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 16280 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x70c1c40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e4900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e4900, 0x6bb9240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4b67cf0, 0x65e4900, 0x6bb9240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29836 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x7e93980, 0x8679a0c, 0x20, 0x20, 0x8283e10, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x7e938e0, 0x20732c8, 0x7e93980, 0x835a5c0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x7e938e0, 0x68ab5c0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5ac5200, 0x81e9adc, 0x68ab59c, 0x8679b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x7e4f1d0, 0x81e9ab0, 0x68ab580, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x71c5d28, 0x13, 0x1cac698, 0x4, 0x8679d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x71c5d28, 0x13, 0x5d53d4c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6b725a0, 0x6081380, 0x7981420, 0x0, 0x7e4f2c0, 0x7397440, 0x1b9a5b0, 0x81e9ab0, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x6081380, 0x2073568, 0x7e93820, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5ac5200, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x839bc00, 0x1828b00, 0x68ab540, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7500c60, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x839bc00, 0x1828b00, 0x68ab540, 0x7569cb4, 0x48f61e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x7a38698, 0x1, 0x0, 0xb2c97000, 0x8b, 0x7e93800, 0x2052fd0, 0x839bc00, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x7a38698, 0x70c5120, 0x1828b00, 0x63bfdc0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 22936 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xdd98f005, 0x1a)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xa, 0x1cc0705, 0xf, 0x51ebb00, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x6d4f9f0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33965 [select]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x9b3cb40, 0x9bde5c0, 0x1cac33a, 0x3, 0x9c18d00, 0x9bde540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 22128 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7b2a500, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a8b200, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a8b200, 0x76057c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6e8a460, 0x5a8b200, 0x76057c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33625 [runnable]:
syscall.Syscall(0x76, 0x94, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fsync(0x94, 0x12b01, 0x12ecc)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:449 +0x30
internal/poll.(*FD).Fsync(0x974f000, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_fsync_posix.go:17 +0x88
os.(*File).Sync(0x99c04d0, 0x0, 0x0)
	/usr/lib/go-1.13/src/os/file_posix.go:113 +0x40
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x9032a20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:311 +0x400
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 27750 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x8c02020, 0x6287e84, 0x20, 0x20, 0x29, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6287f9c, 0x20732c8, 0x8c02020, 0x89526c0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6287f9c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).trackAutoEncryptCARoots(0x8def440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:551 +0x16c
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:433 +0xfb4

goroutine 25506 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xbafefb15, 0x18)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0xb, 0x1cc0705, 0xf, 0x4da36c0, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x6d82128, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 30931 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x7202060, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8b58480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8b58480, 0x7e9db40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7ba8240, 0x8b58480, 0x7e9db40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33215 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x832ed80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 11640 [select, 4 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x62bb700, 0x595dcb0, 0x20732c8, 0x6890c00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33912 [IO wait]:
internal/poll.runtime_pollWait(0xa5332020, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9ea6424, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x9ea6410, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x9ea6410, 0x3, 0x3, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x9b1d710, 0x5c0ecf4, 0x51fb534, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x9b1d710, 0x12b94, 0x1ba48, 0x9915470)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x9e8d9c0, 0x9b1d710)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 33962 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x9b1ec60, 0x1cad6ef, 0x5, 0x9ed1d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 29399 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x6eacd80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x73b4000, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x73b4000, 0x7bc3f80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x51d3790, 0x73b4000, 0x7bc3f80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33210 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x86a9200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x86a9200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x86a9200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x86a9200, 0x4e24ff8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 18257 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6b554e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6842fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6842fc0, 0x7604500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x67a6870, 0x6842fc0, 0x7604500)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 17091 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6ac5dc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e58c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e58c0, 0x5b96680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5acd450, 0x65e58c0, 0x5b96680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 14088 [select, 3 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x51ec600, 0x65f8974, 0x20, 0x20, 0x100, 0x6)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x51b8320, 0x20732c8, 0x51ec600, 0x651c0f8, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x51b8320, 0x5516440, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x6f138c0, 0x5989638, 0x65a33bc, 0x65f8b50, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Health).ServiceNodes(0x64ad860, 0x59895e0, 0x65a33b0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/health_endpoint.go:185 +0x188
reflect.Value.call(0x4aab7c0, 0x64ad8a0, 0x13, 0x1cac698, 0x4, 0x65f8d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab7c0, 0x64ad8a0, 0x13, 0x571ed4c, 0x3, 0x3, 0x63f19e01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x6efb940, 0x630fce0, 0x6517458, 0x0, 0x6a7c460, 0x6f8b600, 0x1b9a700, 0x59895e0, 0x16, 0x1828b28, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x630fce0, 0x2073568, 0x51b8280, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x6f138c0, 0x1cc8efe, 0x13, 0x1b9a700, 0x4b51360, 0x1828b28, 0x65a3380, 0xe0faf4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4e711e0, 0x1cc8efe, 0x13, 0x1b9a700, 0x4b51360, 0x1828b28, 0x65a3380, 0x65a3354, 0x48f6780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*HealthServices).Fetch(0x6c4aaa8, 0xc, 0x0, 0xb2c97000, 0x8b, 0x52dbfe0, 0x2053048, 0x4b51360, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/health_services.go:41 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7d8, 0x6c4aaa8, 0x5b45d40, 0x1828b28, 0x602cc60, 0x0, 0x0, 0x0, 0x0, 0xc, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 28063 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x1d0c86c4, 0x16)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5c35590, 0x4da3200, 0x9, 0x1cc0705, 0xf, 0x736c3a0, 0x20, 0x2052e80, 0x5a4d860)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x5ab4a80, 0x4da3200, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x5b8e1a8, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 30706 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x8437aa0, 0x5714a0c, 0x20, 0x20, 0x873b0e0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x8437a00, 0x20732c8, 0x8437aa0, 0x8b395a8, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x8437a00, 0x7ac7200, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x8d84240, 0x8cdee9c, 0x7ac71dc, 0x5714b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x9374960, 0x8cdee70, 0x7ac71c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x9308f10, 0x13, 0x1cac698, 0x4, 0x5714d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x9308f10, 0x13, 0x4dccd4c, 0x3, 0x3, 0x8d4c0801, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x935ce80, 0x8ca3d70, 0x782a348, 0x0, 0x9374a50, 0x8e09700, 0x1b9a5b0, 0x8cdee70, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x8ca3d70, 0x2073568, 0x8437940, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x8d84240, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x8cded90, 0x1828b00, 0x7ac7180, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x8383b80, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x8cded90, 0x1828b00, 0x7ac7180, 0x8450de4, 0x48f6000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x8e848f0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x8437920, 0x2052fd0, 0x8cded90, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x8e848f0, 0x8f47580, 0x1828b00, 0x7ac7100, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 28325 [select, 1 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x82f3ef0, 0x1cc0a02, 0xf, 0x2052fe8, 0x61ccae0, 0x1, 0x0, 0x1828bc8, 0x4a8c720, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x82f3ef0, 0x20732c8, 0x65b4280, 0x1cc0a02, 0xf, 0x2052fe8, 0x61ccae0, 0x1cb690f, 0xa, 0x54da680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 28218 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5a74800, 0x5db1e38, 0x20732c8, 0x65242c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 11643 [select, 4 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x68912c0, 0x6246a0c, 0x20, 0x20, 0x65288e0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6891120, 0x20732c8, 0x68912c0, 0x595dd50, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6891120, 0x62bb840, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4e37200, 0x62a713c, 0x62bb81c, 0x6246b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x63bd810, 0x62a7110, 0x62bb800, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x60cc2f0, 0x13, 0x1cac698, 0x4, 0x6246d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x60cc2f0, 0x13, 0x4f35d4c, 0x3, 0x3, 0x4d343c01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x60de9a0, 0x5fda300, 0x5a2ec38, 0x0, 0x63bd900, 0x51db9a0, 0x1b9a5b0, 0x62a7110, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5fda300, 0x2073568, 0x6890fa0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4e37200, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x62a7030, 0x1828b00, 0x62bb7c0, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4e71760, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x62a7030, 0x1828b00, 0x62bb7c0, 0x59f57a4, 0x48645a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x60cd820, 0x1, 0x0, 0xb2c97000, 0x8b, 0x6890f20, 0x2052fd0, 0x62a7030, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x60cd820, 0x5cd5b40, 0x1828b00, 0x62bb740, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33814 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xb20f4106, 0x19)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xb, 0x1cc0705, 0xf, 0x57dc5e0, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x601ddd8, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33942 [IO wait]:
internal/poll.runtime_pollWait(0xa497c1cc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9ea65b4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x9ea65a0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x9ea65a0, 0x1b5c4c0, 0x4bb2b40, 0xb6d99600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x9b1dee0, 0xd4, 0x18, 0x9f71a60)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x9b1dee0, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x9b1dee0, 0x20, 0x1b5c4c0, 0x310001, 0x9f71a60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x9b24d80, 0x2064b48, 0x9f54058, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x7501e40, 0x9c19340, 0x9f1ce60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 22672 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7ca13a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7dda000, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7dda000, 0x74904c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7510340, 0x7dda000, 0x74904c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 28920 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x8072ba0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8d85440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8d85440, 0x8614000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x9034820, 0x8d85440, 0x8614000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 10673 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x60bea60, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5c466c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5c466c0, 0x5d29240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5b93520, 0x5c466c0, 0x5d29240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29864 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x6a2b240, 0x80a7008, 0x20732c8, 0x7ac9fc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33969 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x99cd200)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 33688 [IO wait]:
internal/poll.runtime_pollWait(0xa5332440, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9605194, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x9605180, 0x9846000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x9605180, 0x9846000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x8e85be0, 0x9846000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x951eff0, 0x8f70144, 0xc, 0xc, 0x96ccf50, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x951eff0, 0x8f70144, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x96ccee0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x96ccee0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 30686 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x837cca0, 0x79169f4, 0x20, 0x20, 0x1, 0x7ed6c20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x837cac0, 0x20732c8, 0x837cca0, 0x8dcef28, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x837cac0, 0x7b23bc0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x8d84240, 0x822e3d8, 0x5ebc0fc, 0x7916b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x9308fa0, 0x822e3c0, 0x5ebc0f0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x9308ff0, 0x13, 0x1cac698, 0x4, 0x7916d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x9308ff0, 0x13, 0x4edbd4c, 0x3, 0x3, 0x8d4c9901, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x935d000, 0x8ca3d70, 0x8c1bd28, 0x0, 0x9374d70, 0x84377a0, 0x1b47298, 0x822e3c0, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x8ca3d70, 0x2073568, 0x837c8a0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x8d84240, 0x1cbfad5, 0xf, 0x1b47298, 0x822e300, 0x1828bc8, 0x5ebc0c0, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x8383b80, 0x1cbfad5, 0xf, 0x1b47298, 0x822e300, 0x1828bc8, 0x5ebc0c0, 0x5ebc094, 0x4bb34a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x8e84900, 0x1, 0x0, 0xb2c97000, 0x8b, 0x837c880, 0x2052fe8, 0x822e300, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x8e84900, 0x8f475c0, 0x1828bc8, 0x5ebc000, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 11382 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x63c56e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e4fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e4fc0, 0x5bf9840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x484b0a0, 0x65e4fc0, 0x5bf9840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29629 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x7e4e050, 0x1cc0714, 0xf, 0x2052fd0, 0x839bc00, 0x1, 0x0, 0x1828b00, 0x63bfdc0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x7e4e050, 0x20732c8, 0x73972e0, 0x1cc0714, 0xf, 0x2052fd0, 0x839bc00, 0x1cae1b2, 0x5, 0x63bfc00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33139 [runnable]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x7ea2ad0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 27945 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x79959e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8c6a6c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8c6a6c0, 0x8960e80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8d63d80, 0x8c6a6c0, 0x8960e80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33622 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x95b7800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x95b7800, 0x99c04b8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 31222 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x65a0ee0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4a7bb00, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a7bb00, 0x8c74000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x53a6e10, 0x4a7bb00, 0x8c74000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33258 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x852f200, 0x1cae9dd, 0x6, 0x8af3cc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 33808 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x9b3cc90, 0x20001, 0x9038310, 0x1828c90, 0x756e060)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x8776600)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 9590 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6241ee0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x639c240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x639c240, 0x5964dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5c95d80, 0x639c240, 0x5964dc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33224 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x88cecd0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x88cecd0, 0x1cb8913, 0xb, 0x1cb8913, 0xb)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x88cecd0, 0x1cb2cd8, 0x8, 0x7690fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x88cecd0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 33267 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x7814240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:133 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:523 +0xb9c

goroutine 27634 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x8972840, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7815680, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7815680, 0x8297d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7e86820, 0x7815680, 0x8297d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 32856 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x6e7a900, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6f126c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6f126c0, 0x91da440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x64dbaf0, 0x6f126c0, 0x91da440)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 28217 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x65242c0, 0x63809f4, 0x20, 0x20, 0x1, 0x65ac180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6524180, 0x20732c8, 0x65242c0, 0x5db1e38, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6524180, 0x5a74800, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4a7b440, 0x72618d8, 0x4a8c96c, 0x6380b7c, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x4d7fdb8, 0x72618c0, 0x4a8c960, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4aab900, 0x4d7fe08, 0x13, 0x1cac698, 0x4, 0x6380d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab900, 0x4d7fe08, 0x13, 0x5f8d54c, 0x3, 0x3, 0xb96cd401, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x7504a60, 0x5cad5f0, 0x6f055c8, 0x0, 0x729f4f0, 0x72bdb40, 0x1b47298, 0x72618c0, 0x16, 0x1828bc8, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5cad5f0, 0x2073568, 0x65240a0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4a7b440, 0x1cbfad5, 0xf, 0x1b47298, 0x61ccae0, 0x1828bc8, 0x4a8c870, 0xe0fce4, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7501ce0, 0x1cbfad5, 0xf, 0x1b47298, 0x61ccae0, 0x1828bc8, 0x4a8c870, 0x4a8c844, 0x48f6000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x50affb8, 0x1, 0x0, 0xb2c97000, 0x8b, 0x6524080, 0x2052fe8, 0x61ccae0, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7f0, 0x50affb8, 0x5b45380, 0x1828bc8, 0x4a8c720, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33260 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x852f200, 0x1cad6ef, 0x5, 0x8af3d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 18878 [semacquire, 2 minutes]:
sync.runtime_Semacquire(0x52d3b0c)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x52d3b0c)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x52d3ab0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x523b680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:362 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x523b680, 0x7152040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:180 +0x694
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6deb1d0, 0x523b680, 0x7152040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 18088 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6afd2a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5ac5680, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5ac5680, 0x7077d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x67108b0, 0x5ac5680, 0x7077d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29287 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x6f0fd80, 0x6e60b68, 0x20732c8, 0x6e0a3c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 28209 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x67ca460, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x4a7b440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a7b440, 0x79ad640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6f045f0, 0x4a7b440, 0x79ad640)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 19622 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x79dedc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x6aad490, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x6aad490)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 26543 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x6d5b140, 0x793297c, 0x20, 0x20, 0x1d1d0, 0x1affc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6d5ad60, 0x20732c8, 0x6d5b140, 0x51bb5e0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6d5ad60, 0x6a86dc0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x71d2b40, 0x598f1d8, 0x5168a2c, 0x7932b50, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Catalog).ServiceNodes(0x7d11688, 0x598f180, 0x5168a20, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go:348 +0x190
reflect.Value.call(0x4aab240, 0x7d116f8, 0x13, 0x1cac698, 0x4, 0x7932d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab240, 0x7d116f8, 0x13, 0x56d254c, 0x3, 0x3, 0x9bb10a01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x8337aa0, 0x84a5110, 0x6557bc8, 0x0, 0x7edd720, 0x797e020, 0x1b9a700, 0x598f180, 0x16, 0x1828ce0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x84a5110, 0x2073568, 0x6d5aca0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x71d2b40, 0x1ccb2fd, 0x14, 0x1b9a700, 0x59cd720, 0x1828ce0, 0x51689c0, 0xe0d610, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7eb8b00, 0x1ccb2fd, 0x14, 0x1b9a700, 0x59cd720, 0x1828ce0, 0x51689c0, 0x5168994, 0x48645a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*CatalogServices).Fetch(0x84aee90, 0x7, 0x0, 0xb2c97000, 0x8b, 0x6d5ac20, 0x2053048, 0x59cd720, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/catalog_services.go:41 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b790, 0x84aee90, 0x6464220, 0x1828ce0, 0x6824d50, 0x0, 0x0, 0x0, 0x0, 0x7, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 11639 [select, 4 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x6890c00, 0x66fba0c, 0x20, 0x20, 0x65287a0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6890b00, 0x20732c8, 0x6890c00, 0x595dcb0, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6890b00, 0x62bb700, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4e37200, 0x62a6fec, 0x62bb6dc, 0x66fbb70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x63bd810, 0x62a6fc0, 0x62bb6c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x60cc2f0, 0x13, 0x1cac698, 0x4, 0x66fbd4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x60cc2f0, 0x13, 0x56d554c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x60de9a0, 0x5fda300, 0x5a2eaf0, 0x0, 0x63bd900, 0x516d4a0, 0x1b9a5b0, 0x62a6fc0, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5fda300, 0x2073568, 0x68909e0, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4e37200, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x62a6e70, 0x1828b00, 0x62bb680, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x4e71760, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x62a6e70, 0x1828b00, 0x62bb680, 0x59f5624, 0x48645a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x60cd820, 0x1, 0x0, 0xb2c97000, 0x8b, 0x68909c0, 0x2052fd0, 0x62a6e70, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x60cd820, 0x5cd5b40, 0x1828b00, 0x62bb600, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29595 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x6bc1800, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5c478c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5c478c0, 0x895d980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6710950, 0x5c478c0, 0x895d980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33660 [IO wait]:
internal/poll.runtime_pollWait(0xa5332a70, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9604b54, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x9604b40, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x9604b40, 0x7732c10, 0xffffffff, 0x7229c)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x99c7ed0, 0x0, 0x1, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x99c7ed0, 0x1, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x8d85680, 0x206b9a8, 0x99c7ed0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:515 +0xb0c

goroutine 33534 [IO wait]:
internal/poll.runtime_pollWait(0xa497c358, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x96050a4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x9605090, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x9605090, 0x1b5c4c0, 0x4bb2b40, 0xb6d99600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x9b88c00, 0xc2, 0x18, 0x9809820)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x9b88c00, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x9b88c00, 0x20, 0x1b5c4c0, 0x310001, 0x9809820)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:895 +0x1c
net/http.(*Server).Serve(0x8fe1c20, 0x2064b48, 0x8e853c0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x214
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x8e8a000, 0x961dbc0, 0x9750c20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:934 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:931 +0x78

goroutine 21148 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6a220e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6dedb00, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6dedb00, 0x6c7ed00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x65f3210, 0x6dedb00, 0x6c7ed00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33272 [IO wait]:
internal/poll.runtime_pollWait(0xa497bf38, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x8d52564, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x8d52550, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x8d52550, 0x4bb2000, 0xb6d996d0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x7173ae0, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x7173ae0, 0x7a38810, 0x8d52550, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x677ba80, 0x206b9a8, 0x7173ae0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x677ba80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5ebd170, 0x1cac2e9, 0x3, 0x7010dd0, 0xf, 0x7173ad0, 0x7229c, 0x49fe7b4)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x8ae02c0, 0x5ebd170, 0x8325680, 0x83256c0, 0x205db30, 0x7ae1020)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 22073 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6d9c2a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5ac4240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5ac4240, 0x70c3000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x65f29a0, 0x5ac4240, 0x70c3000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29930 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x7e8d660, 0x5e1ea0c, 0x20, 0x20, 0x83729a0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x7e8d5c0, 0x20732c8, 0x7e8d660, 0x835aa68, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x7e8d5c0, 0x69cb300, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x65e4240, 0x8dda1ec, 0x69cb2dc, 0x5e1eb70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x80dc7d0, 0x8dda1c0, 0x69cb2c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x601c598, 0x13, 0x1cac698, 0x4, 0x5e1ed4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x601c598, 0x13, 0x4d4954c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x7274900, 0x5f98300, 0x79819c0, 0x0, 0x80dc8c0, 0x79e47e0, 0x1b9a5b0, 0x8dda1c0, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5f98300, 0x2073568, 0x7e8d500, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x65e4240, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x81e8d90, 0x1828b00, 0x69cb280, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x83822c0, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x81e8d90, 0x1828b00, 0x69cb280, 0x720ae14, 0x48f61e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x601d8b0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x7e8d4e0, 0x2052fd0, 0x81e8d90, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x601d8b0, 0x7eb52c0, 0x1828b00, 0x5c61dc0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33254 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x70dd4a0, 0x90549c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:152 +0xe0
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 33263 [select]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x721a0f0, 0x6df5880, 0x1cac33a, 0x3, 0x90540c0, 0x6df57c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:488 +0xcc8

goroutine 9467 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5e5d920, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5c46900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5c46900, 0x5922800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6517130, 0x5c46900, 0x5922800)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29392 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x62d3c40, 0x68bd4e0, 0x20732c8, 0x5b69180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 29286 [select]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x6e0a3c0, 0x6c9ea0c, 0x20, 0x20, 0x6a584d0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x6e0a300, 0x20732c8, 0x6e0a3c0, 0x6e60b68, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x6e0a300, 0x6f0fd80, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x7ddab40, 0x7f118ac, 0x6f0fd5c, 0x6c9eb70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x7760640, 0x7f11880, 0x6f0fd40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x75544d0, 0x13, 0x1cac698, 0x4, 0x6c9ed4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x75544d0, 0x13, 0x4ed654c, 0x3, 0x3, 0x3797db01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x7ea8920, 0x7b07740, 0x74ef068, 0x0, 0x77609b0, 0x75ab1e0, 0x1b9a5b0, 0x7f11880, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x7b07740, 0x2073568, 0x6e0a220, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x7ddab40, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x821ff10, 0x1828b00, 0x6f0fd00, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7eb9760, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x821ff10, 0x1828b00, 0x6f0fd00, 0x6f1fb34, 0x4ba01e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x6ffa4c8, 0x1, 0x0, 0xb2c97000, 0x8b, 0x6e0a200, 0x2052fd0, 0x821ff10, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x6ffa4c8, 0x6464440, 0x1828b00, 0x6ecb680, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29926 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x68ab5c0, 0x835a5c0, 0x20732c8, 0x7e93980)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 28329 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x65b4860, 0x7ae6a0c, 0x20, 0x20, 0x5b20bb0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x65b4700, 0x20732c8, 0x65b4860, 0x5860ed8, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x65b4700, 0x54dac00, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4a7b440, 0x53b8c6c, 0x54dabdc, 0x7ae6b70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x729f0e0, 0x53b8c40, 0x54dabc0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x4d7fd28, 0x13, 0x1cac698, 0x4, 0x7ae6d4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x4d7fd28, 0x13, 0x5fc4d4c, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x75048c0, 0x5cad5f0, 0x4df9f50, 0x0, 0x729f1d0, 0x65b4620, 0x1b9a5b0, 0x53b8c40, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5cad5f0, 0x2073568, 0x65b4600, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4a7b440, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x53b8690, 0x1828b00, 0x54dab80, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x7501ce0, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x53b8690, 0x1828b00, 0x54dab80, 0x68b6244, 0x48f61e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x50affb0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x65b45c0, 0x2052fd0, 0x53b8690, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x50affb0, 0x5b45340, 0x1828b00, 0x54daa40, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 33941 [select]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x9f1cb80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 33686 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x96ccd90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 32258 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x8427e80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x9d3e6c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x9d3e6c0, 0x8ac0700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x9640940, 0x9d3e6c0, 0x8ac0700)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 19289 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5b2d5c0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6f13b00, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6f13b00, 0x6b951c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x64da670, 0x6f13b00, 0x6b951c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 28211 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x610c640, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6842b40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6842b40, 0x80fd580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6a556d0, 0x6842b40, 0x80fd580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 26544 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x6a86dc0, 0x51bb5e0, 0x20732c8, 0x6d5b140)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 33261 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x7814240)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:477 +0x9b8

goroutine 10384 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x629aca0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a51d40, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a51d40, 0x5bd7b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5af6c90, 0x5a51d40, 0x5bd7b00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8983 [chan send, 4 minutes]:
testing.tRunner.func1(0x5beb0e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x5beb0e0, 0x1d9d188)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8982 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5beb040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestMakeWatchHandler(0x5beb040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/watch_handler_test.go:15 +0x20
testing.tRunner(0x5beb040, 0x1d9d18c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8973 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6f180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSetFilePermissions(0x4d6f180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/util_test.go:28 +0x1c
testing.tRunner(0x4d6f180, 0x1d9d2f8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8972 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6f0e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestStringHash(0x4d6f0e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/util_test.go:18 +0x1c
testing.tRunner(0x4d6f0e0, 0x1d9d320)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8971 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6f040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUserEventToken(0x4d6f040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:186 +0x20
testing.tRunner(0x4d6f040, 0x1d9d35c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8970 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6efa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestFireReceiveEvent(0x4d6efa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:150 +0x1c
testing.tRunner(0x4d6efa0, 0x1d9d068)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 29398 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x91b134e7, 0x12)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x5bfa370, 0x5a7f5a0, 0xb, 0x1cc0705, 0xf, 0x5884d60, 0x20, 0x2052e80, 0x5bd12f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:676 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7a8, 0x4b58080, 0x5a7f5a0, 0x0, 0x0, 0x0, 0x0, 0x20528f8, 0x60ca3d8, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:616 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 8968 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6ee60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestShouldProcessUserEvent(0x4d6ee60)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:50 +0x20
testing.tRunner(0x4d6ee60, 0x1d9d30c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8967 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6edc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestValidateUserEventParams(0x4d6edc0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:13 +0x1c
testing.tRunner(0x4d6edc0, 0x1d9d360)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 21714 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6803540, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7dda240, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7dda240, 0x7970b80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7176950, 0x7dda240, 0x7970b80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 28561 [semacquire, 1 minutes]:
sync.runtime_SemacquireMutex(0x7e81dd4, 0x180200, 0x1)
	/usr/lib/go-1.13/src/runtime/sema.go:71 +0x34
sync.(*Mutex).lockSlow(0x7e81dd0)
	/usr/lib/go-1.13/src/sync/mutex.go:138 +0x218
sync.(*Mutex).Lock(0x7e81dd0)
	/usr/lib/go-1.13/src/sync/mutex.go:81 +0x4c
sync.(*RWMutex).Lock(0x7e81dd0)
	/usr/lib/go-1.13/src/sync/rwmutex.go:98 +0x20
github.com/hashicorp/consul/agent/local.(*State).StopNotify(0x7e81dd0, 0x7fc02c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/local/state.go:956 +0x20
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x649fc50, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:120 +0x104
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x61526e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 11647 [select, 4 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x62bb880, 0x595dfc8, 0x20732c8, 0x6891be0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 29570 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x8c22050, 0x1cc0714, 0xf, 0x2052fd0, 0x5a6d9d0, 0x1, 0x0, 0x1828b00, 0x5ab5b80, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x8c22050, 0x20732c8, 0x60a8ae0, 0x1cc0714, 0xf, 0x2052fd0, 0x5a6d9d0, 0x1cae1b2, 0x5, 0x5ab5a00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 8969 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6ef00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIngestUserEvent(0x4d6ef00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:119 +0x20
testing.tRunner(0x4d6ef00, 0x1d9d124)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 33913 [IO wait]:
internal/poll.runtime_pollWait(0xa5331f18, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9ea6474, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x9ea6460, 0x9f28000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x9ea6460, 0x9f28000, 0x10000, 0x10000, 0x9b2ae00, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x9ae1d48, 0x9f28000, 0x10000, 0x10000, 0x4a00f34, 0x101, 0x4a00f08, 0x4d19ac)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x9ae1d48, 0x9f28000, 0x10000, 0x10000, 0x1cc3fc8, 0x11, 0x1b9a5b0, 0x9abcee0, 0x1828c90)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x9e8d9c0, 0x9ae1d48)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 15117 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x674f520, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e4d80, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e4d80, 0x5aaa740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x497c8a0, 0x65e4d80, 0x5aaa740)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 21720 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6e76160, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x80ac480, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x80ac480, 0x7564ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x70925b0, 0x80ac480, 0x7564ac0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33269 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x8ae02c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 13352 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6b8ad00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5ac4fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5ac4fc0, 0x6592180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x51180f0, 0x5ac4fc0, 0x6592180)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8960 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6e960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_KV_Actions(0x4d6e960)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:134 +0x1c
testing.tRunner(0x4d6e960, 0x1d9d33c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8637 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x5a6d030)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 8638 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5a6d030)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 8984 [chan receive, 4 minutes]:
testing.runTests.func1.1(0x48ce000)
	/usr/lib/go-1.13/src/testing/testing.go:1207 +0x28
created by testing.runTests.func1
	/usr/lib/go-1.13/src/testing/testing.go:1207 +0x98

goroutine 33691 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x96ccf50, 0x9847000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x951f080)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x951f080, 0x9840660, 0x0, 0x96ccf54)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x9840400, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x9b89e80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x951f0b0, 0x1845a70, 0x8ba17e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x951f0b0, 0x1845a70, 0x8ba17e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x951f050, 0x1845a70, 0x8ba17e0, 0x8ba17e0, 0x721a1f8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x951f050, 0x8ba17e0, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x721a1e0, 0x2074928, 0x951f050, 0x768af90, 0x48a0400, 0x2074901, 0x0, 0x0, 0x20)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x721a1e0, 0x2074928, 0x951f050, 0x9b89f00, 0x8c9ac30, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9b89f00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x721a1e0, 0x2074928, 0x951f050, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x7814240, 0x2080c30, 0x96ccf50)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 8961 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6ea00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_UpdateCheck(0x4d6ea00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:408 +0x20
testing.tRunner(0x4d6ea00, 0x1d9d340)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8786 [select]:
github.com/hashicorp/yamux.(*Session).send(0x61cb2d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 33252 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x70dd4a0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 8345 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x5c355e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 8641 [IO wait]:
internal/poll.runtime_pollWait(0xa5329b04, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5fd6654, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5fd6640, 0x57ac000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5fd6640, 0x57ac000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4e5ea68, 0x57ac000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5a9aa20, 0x4dc8f00, 0xc, 0xc, 0x61cb3b0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x5a9aa20, 0x4dc8f00, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x61cb2d0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x61cb2d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8347 [select, 5 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x20732c8, 0x5becf60, 0x59daa0c, 0x20, 0x20, 0x54f82a0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5becec0, 0x20732c8, 0x5becf60, 0x54f82a8, 0x20732c8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5becec0, 0x5d45a40, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5a8b8c0, 0x509679c, 0x5d45a1c, 0x59dab70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:451 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x5902730, 0x5096770, 0x5d45a00, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4aab680, 0x523fde8, 0x13, 0x1cac698, 0x4, 0x59dad4c, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4aab680, 0x523fde8, 0x13, 0x5fb854c, 0x3, 0x3, 0xd1c8a701, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5c13be0, 0x5a1aa50, 0x5118aa8, 0x0, 0x5902820, 0x5c1b980, 0x1b9a5b0, 0x5096770, 0x16, 0x1828b00, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x5a1aa50, 0x2073568, 0x5bece00, 0x3f800000, 0x15c6b1c)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5a8b8c0, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x5a6d500, 0x1828b00, 0x5d459c0, 0xe0f8fc, 0x40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1104 +0x120
github.com/hashicorp/consul/agent.(*Agent).RPC(0x5c66160, 0x1cbf6ac, 0xf, 0x1b9a5b0, 0x5a6d500, 0x1828b00, 0x5d459c0, 0x4a8c244, 0x4ba01e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1598 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x551b258, 0xa, 0x0, 0xb2c97000, 0x8b, 0x5becde0, 0x2052fd0, 0x5a6d500, 0x0, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x205b7c0, 0x551b258, 0x4da31a0, 0x1828b00, 0x56e4300, 0x0, 0x0, 0x0, 0x0, 0xa, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:495 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:458 +0x308

goroutine 29865 [select]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x65ee540, 0x80a7168, 0x20732c8, 0x82bd040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 8787 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x61cb2d0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8771 [IO wait]:
internal/poll.runtime_pollWait(0xa5329a80, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5fd66a4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5fd6690, 0x57d9000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5fd6690, 0x57d9000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x54f83b0, 0x57d9000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4a8c930, 0x5118e14, 0xc, 0xc, 0x5096a80, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x4a8c930, 0x5118e14, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x5096a10, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x5096a10)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8772 [select]:
github.com/hashicorp/yamux.(*Session).send(0x5096a10)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 8773 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x5096a10)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8774 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x5096a80, 0x536f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x4a8c9c0)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x4a8c9c0, 0x9ec2800, 0x0, 0x5096a84)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5bed940, 0x4bec580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x5645f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x4a8ca20, 0x1845a70, 0x62cc320)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x4a8ca20, 0x1845a70, 0x62cc320, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x4a8c990, 0x1845a70, 0x62cc320, 0x62cc320, 0x5a1aa68)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x4a8c990, 0x62cc320, 0x48a0474, 0x347368)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x5a1aa50, 0x2074928, 0x4a8c990, 0x6114f90, 0x48a0464, 0x2074901, 0x4a8c990, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x5a1aa50, 0x2074928, 0x4a8c990, 0x9b1d610, 0x59034a0, 0x76dc8, 0x347690, 0x32b5a1c, 0x48a03f0, 0x9b1d610, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x5a1aa50, 0x2074928, 0x4a8c990, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x5a8b8c0, 0x2080c30, 0x5096a80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:154 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:139 +0x14c

goroutine 26395 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x7d0f4e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8c52fc0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8c52fc0, 0x802c6c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7ef7c80, 0x8c52fc0, 0x802c6c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 31344 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x8427980, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x73b46c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x73b46c0, 0x953c580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x548b200, 0x73b46c0, 0x953c580)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33902 [IO wait]:
internal/poll.runtime_pollWait(0xa497be30, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9c1b554, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x9c1b540, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x9c1b540, 0x1, 0x2020501, 0xd498d4)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x9e6dd00, 0x205e200, 0x8b3f0f0, 0x4d1768)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x9e6dd00, 0x0, 0xd4a75c, 0x7c2a100)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x9f08340, 0x9e6dd00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 8962 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6eaa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiIndex(0x4d6eaa0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:25 +0x20
testing.tRunner(0x4d6eaa0, 0x1d9d348)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 20594 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x78110e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x7815440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x7815440, 0x708f840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x7d1fec0, 0x7815440, 0x708f840)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8963 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6eb40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiNodes(0x4d6eb40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:69 +0x20
testing.tRunner(0x4d6eb40, 0x1d9d354)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 27999 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x73cb740, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x8aa2900, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x8aa2900, 0x895dd80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x8aeaf80, 0x8aa2900, 0x895dd80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 23012 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x6681020, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x6ded440, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x6ded440, 0x773ac80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x6f4a3b0, 0x6ded440, 0x773ac80)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 29631 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x7e4e050, 0x1cc0a02, 0xf, 0x2052fe8, 0x767e180, 0x1, 0x0, 0x1828bc8, 0x5d54c30, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x7e4e050, 0x20732c8, 0x73972e0, 0x1cc0a02, 0xf, 0x2052fe8, 0x767e180, 0x1cb690f, 0xa, 0x63bfc00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33921 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x9b1e6c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 33270 [select]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x8ae02c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 8959 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6e8c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_Size_Ops(0x4d6e8c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:107 +0x20
testing.tRunner(0x4d6e8c0, 0x1d9d330)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8964 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6ebe0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiNodes_Filter(0x4d6ebe0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:107 +0x20
testing.tRunner(0x4d6ebe0, 0x1d9d350)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 33687 [select]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x96ccee0, 0x1d9d490, 0x7814240, 0x2080c30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x7814240, 0x2080d50, 0x8e85be0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:132 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x7814240, 0x2080d50, 0x8e85be0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:107 +0x454
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 10217 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5e06a40, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x5a51b00, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a51b00, 0x6844080)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5af79d0, 0x5a51b00, 0x6844080)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 8966 [chan receive, 4 minutes]:
testing.(*testContext).waitParallel(0x49d1e40)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4d6ed20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiServices(0x4d6ed20)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:199 +0x20
testing.tRunner(0x4d6ed20, 0x1d9d358)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 11635 [select, 4 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x5ffe190, 0x1cc0a02, 0xf, 0x2052fe8, 0x5df3440, 0x1, 0x0, 0x1828bc8, 0x59f5830, 0x0, ...)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:376 +0x454
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x5ffe190, 0x20732c8, 0x68906a0, 0x1cc0a02, 0xf, 0x2052fe8, 0x5df3440, 0x1cb690f, 0xa, 0x62bb540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 33964 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x9f08540)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 18079 [chan receive, 3 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x76d3120, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leadershipTransfer(0x65e46c0, 0x1d1cbe5, 0x30)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:133 +0xf8
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x65e46c0, 0x7214d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:187 +0x6a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x516a330, 0x65e46c0, 0x7214d00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8

goroutine 33663 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x8e8a000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1974 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:500 +0x744

goroutine 33664 [select]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x8e8a000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 33665 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x8e8a000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 33666 [IO wait]:
internal/poll.runtime_pollWait(0xa5332338, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x951a4c4, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x951a4b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x951a4b0, 0x4ba0000, 0xb6d9936c, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x9e0d9a0, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x9e0d9a0, 0x946c658, 0x951a4b0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x636d100, 0x206b9a8, 0x9e0d9a0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x636d100, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x91a7800, 0x1cac2e9, 0x3, 0x7c13fd0, 0xf, 0x9e0d990, 0x8744680, 0x2)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x8e8a000, 0x91a7800, 0x974f700, 0x974f740, 0x205db30, 0x95c9000)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 33667 [IO wait]:
internal/poll.runtime_pollWait(0xa53321ac, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9263374, 0x72, 0xff00, 0xffff, 0x8170240)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x9263360, 0x9e2a000, 0xffff, 0xffff, 0x8170240, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x9263360, 0x9e2a000, 0xffff, 0xffff, 0x8170240, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x99c0e70, 0x9e2a000, 0xffff, 0xffff, 0x8170240, 0x28, 0x28, 0xb6d99008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x99c0e70, 0x9e2a000, 0xffff, 0xffff, 0x8170240, 0x28, 0x28, 0xb6d99008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x99c0e70, 0x9e2a000, 0xffff, 0xffff, 0x61, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x8ae2780, 0x99c0e70, 0x77359400, 0x0, 0x2054601, 0x32c97a8, 0xa49a7110, 0x32c97a8, 0xffffff01, 0xa49a70f0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x99c0f00, 0x99c0e70, 0x77359400, 0x0, 0x9b3d680, 0x1, 0x0, 0x0, 0x2054770, 0x9b3d680)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x8ae2780, 0x99c0e70, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x8ae2780, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x91a7830, 0x1cac30d, 0x3, 0x8bfe840, 0xf, 0x9d77d90, 0x7229c, 0x708df3c)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x8e8a000, 0x91a7830, 0x974f700, 0x974f740, 0x205db48, 0x95c9040)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 33672 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x88ced70)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:496 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 33684 [IO wait]:
internal/poll.runtime_pollWait(0xa5331f9c, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x92635f4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x92635e0, 0x97b3000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x92635e0, 0x97b3000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a75c0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x8e85bc0, 0x97b3000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x951eea0, 0x8f70120, 0xc, 0xc, 0x96cce70, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x2052640, 0x951eea0, 0x8f70120, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x96ccd90, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x96ccd90)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 33970 [chan receive]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x9ed4e10, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x7501e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:493 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:492 +0x728

goroutine 33972 [select]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x7501e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:503 +0x760

goroutine 33973 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x7501e40)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1889 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:507 +0xa54

goroutine 33974 [IO wait]:
internal/poll.runtime_pollWait(0xa497c3dc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x940dd24, 0x72, 0x0, 0x0, 0x1cb0ade)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x940dd10, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x940dd10, 0x48654a0, 0xb6d99a34, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x9e6b870, 0x41c478, 0x8, 0x19c0dc8)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x9e6b870, 0x9c251e0, 0x940dd10, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x84db180, 0x206b9a8, 0x9e6b870, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x84db180, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x9ed4e40, 0x1cac2e9, 0x3, 0x9eda140, 0xf, 0x9e6b860, 0x1828c90, 0x5f5f00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x7501e40, 0x9ed4e40, 0x9f08600, 0x9f08640, 0x205db30, 0x9b51480)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 33975 [IO wait]:
internal/poll.runtime_pollWait(0xa5331d08, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x9c1b6e4, 0x72, 0xff00, 0xffff, 0x8170f00)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x9c1b6d0, 0x9f58000, 0xffff, 0xffff, 0x8170f00, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x9c1b6d0, 0x9f58000, 0xffff, 0xffff, 0x8170f00, 0x28, 0x28, 0x0, 0x1, 0xa09d0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x9f064d0, 0x9f58000, 0xffff, 0xffff, 0x8170f00, 0x28, 0x28, 0xb6d99008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x9f064d0, 0x9f58000, 0xffff, 0xffff, 0x8170f00, 0x28, 0x28, 0xb6d99008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x9f064d0, 0x9f58000, 0xffff, 0xffff, 0x62, 0x32b9570, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x8776e80, 0x9f064d0, 0x77359400, 0x0, 0x9ed4f01, 0x41c7b8, 0x8, 0x1be6378, 0x205db01, 0x9f064f8)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x9f064f8, 0x9f064d0, 0x77359400, 0x0, 0x40000000, 0x0, 0x0, 0x542de4c, 0x3dee50, 0x9ed4f14)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x8776e80, 0x9f064d0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x8776e80, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x9ed4e70, 0x1cac30d, 0x3, 0x9ecd9f0, 0xf, 0x9f4e500, 0x99550a0, 0x2)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/dns.go:203 +0x234
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x7501e40, 0x9ed4e70, 0x9f08600, 0x9f08640, 0x205db48, 0x9b514c0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:762 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xd4

goroutine 33978 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).retrySyncFullEventFn(0x9992780, 0x0, 0x37)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:236 +0x1b8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x9992780, 0x1cbc90d, 0xd, 0x1cbc90d, 0xd)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:176 +0x60
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x9992780, 0x1cb2cd8, 0x8, 0x8675fdc)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x9992780)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/agent.go:1820 +0x30

goroutine 34021 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x9f8e1e0, 0x0, 0x0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x99cd200, 0x9f88e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:168 +0x4a0
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x9eda124, 0x99cd200, 0x9f88e00)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:80 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:78 +0x1b8
FAIL	github.com/hashicorp/consul/agent	421.463s
=== RUN   TestAE_scaleFactor
=== PAUSE TestAE_scaleFactor
=== RUN   TestAE_Pause_nestedPauseResume
=== PAUSE TestAE_Pause_nestedPauseResume
=== RUN   TestAE_Pause_ResumeTriggersSyncChanges
--- PASS: TestAE_Pause_ResumeTriggersSyncChanges (0.00s)
=== RUN   TestAE_staggerDependsOnClusterSize
--- PASS: TestAE_staggerDependsOnClusterSize (0.00s)
=== RUN   TestAE_Run_SyncFullBeforeChanges
--- PASS: TestAE_Run_SyncFullBeforeChanges (0.00s)
=== RUN   TestAE_Run_Quit
=== RUN   TestAE_Run_Quit/Run_panics_without_ClusterSize
=== RUN   TestAE_Run_Quit/runFSM_quits
--- PASS: TestAE_Run_Quit (0.00s)
    --- PASS: TestAE_Run_Quit/Run_panics_without_ClusterSize (0.00s)
    --- PASS: TestAE_Run_Quit/runFSM_quits (0.00s)
=== RUN   TestAE_FSM
=== RUN   TestAE_FSM/fullSyncState
=== RUN   TestAE_FSM/fullSyncState/Paused_->_retryFullSyncState
=== RUN   TestAE_FSM/fullSyncState/SyncFull()_error_->_retryFullSyncState
[ERR] agent: failed to sync remote state: boom
=== RUN   TestAE_FSM/fullSyncState/SyncFull()_OK_->_partialSyncState
=== RUN   TestAE_FSM/retryFullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/shutdownEvent_->_doneState
=== RUN   TestAE_FSM/retryFullSyncState/syncFullNotifEvent_->_fullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/syncFullTimerEvent_->_fullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/invalid_event_->_panic_
=== RUN   TestAE_FSM/partialSyncState
=== RUN   TestAE_FSM/partialSyncState/shutdownEvent_->_doneState
=== RUN   TestAE_FSM/partialSyncState/syncFullNotifEvent_->_fullSyncState
=== RUN   TestAE_FSM/partialSyncState/syncFullTimerEvent_->_fullSyncState
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+Paused_->_partialSyncState
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_error_->_partialSyncState
[ERR] agent: failed to sync changes: boom
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_OK_->_partialSyncState
=== RUN   TestAE_FSM/partialSyncState/invalid_event_->_panic_
=== RUN   TestAE_FSM/invalid_state_->_panic_
--- PASS: TestAE_FSM (0.01s)
    --- PASS: TestAE_FSM/fullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/Paused_->_retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/SyncFull()_error_->_retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/SyncFull()_OK_->_partialSyncState (0.00s)
    --- PASS: TestAE_FSM/retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/shutdownEvent_->_doneState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/syncFullNotifEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/syncFullTimerEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/invalid_event_->_panic_ (0.00s)
    --- PASS: TestAE_FSM/partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/shutdownEvent_->_doneState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncFullNotifEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncFullTimerEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+Paused_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_error_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_OK_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/invalid_event_->_panic_ (0.00s)
    --- PASS: TestAE_FSM/invalid_state_->_panic_ (0.00s)
=== RUN   TestAE_RetrySyncFullEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_shutdownEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_shutdownEvent_during_FullNotif
=== RUN   TestAE_RetrySyncFullEvent/trigger_syncFullNotifEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_syncFullTimerEvent
--- PASS: TestAE_RetrySyncFullEvent (0.13s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_shutdownEvent (0.00s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_shutdownEvent_during_FullNotif (0.10s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_syncFullNotifEvent (0.01s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_syncFullTimerEvent (0.02s)
=== RUN   TestAE_SyncChangesEvent
=== RUN   TestAE_SyncChangesEvent/trigger_shutdownEvent
=== RUN   TestAE_SyncChangesEvent/trigger_shutdownEvent_during_FullNotif
=== RUN   TestAE_SyncChangesEvent/trigger_syncFullNotifEvent
=== RUN   TestAE_SyncChangesEvent/trigger_syncFullTimerEvent
=== RUN   TestAE_SyncChangesEvent/trigger_syncChangesNotifEvent
--- PASS: TestAE_SyncChangesEvent (0.13s)
    --- PASS: TestAE_SyncChangesEvent/trigger_shutdownEvent (0.00s)
    --- PASS: TestAE_SyncChangesEvent/trigger_shutdownEvent_during_FullNotif (0.10s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncFullNotifEvent (0.01s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncFullTimerEvent (0.02s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncChangesNotifEvent (0.00s)
=== CONT  TestAE_scaleFactor
=== RUN   TestAE_scaleFactor/100_nodes
=== RUN   TestAE_scaleFactor/200_nodes
=== RUN   TestAE_scaleFactor/1000_nodes
=== RUN   TestAE_scaleFactor/10000_nodes
--- PASS: TestAE_scaleFactor (0.00s)
    --- PASS: TestAE_scaleFactor/100_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/200_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/1000_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/10000_nodes (0.00s)
=== CONT  TestAE_Pause_nestedPauseResume
--- PASS: TestAE_Pause_nestedPauseResume (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/ae	0.334s
=== RUN   TestParseFlags
=== RUN   TestParseFlags/#00
=== RUN   TestParseFlags/-bind_a
=== RUN   TestParseFlags/-bootstrap
=== RUN   TestParseFlags/-bootstrap=true
=== RUN   TestParseFlags/-bootstrap=false
=== RUN   TestParseFlags/-config-file_a_-config-dir_b_-config-file_c_-config-dir_d
=== RUN   TestParseFlags/-datacenter_a
=== RUN   TestParseFlags/-dns-port_1
=== RUN   TestParseFlags/-grpc-port_1
=== RUN   TestParseFlags/-serf-lan-port_1
=== RUN   TestParseFlags/-serf-wan-port_1
=== RUN   TestParseFlags/-server-port_1
=== RUN   TestParseFlags/-join_a_-join_b
=== RUN   TestParseFlags/-node-meta_a:b_-node-meta_c:d
=== RUN   TestParseFlags/-bootstrap_true
--- PASS: TestParseFlags (0.04s)
    --- PASS: TestParseFlags/#00 (0.00s)
    --- PASS: TestParseFlags/-bind_a (0.00s)
    --- PASS: TestParseFlags/-bootstrap (0.00s)
    --- PASS: TestParseFlags/-bootstrap=true (0.00s)
    --- PASS: TestParseFlags/-bootstrap=false (0.00s)
    --- PASS: TestParseFlags/-config-file_a_-config-dir_b_-config-file_c_-config-dir_d (0.00s)
    --- PASS: TestParseFlags/-datacenter_a (0.00s)
    --- PASS: TestParseFlags/-dns-port_1 (0.00s)
    --- PASS: TestParseFlags/-grpc-port_1 (0.00s)
    --- PASS: TestParseFlags/-serf-lan-port_1 (0.00s)
    --- PASS: TestParseFlags/-serf-wan-port_1 (0.00s)
    --- PASS: TestParseFlags/-server-port_1 (0.00s)
    --- PASS: TestParseFlags/-join_a_-join_b (0.00s)
    --- PASS: TestParseFlags/-node-meta_a:b_-node-meta_c:d (0.00s)
    --- PASS: TestParseFlags/-bootstrap_true (0.00s)
=== RUN   TestMerge
=== RUN   TestMerge/top_level_fields
--- PASS: TestMerge (0.01s)
    --- PASS: TestMerge/top_level_fields (0.01s)
=== RUN   TestPatchSliceOfMaps
=== RUN   TestPatchSliceOfMaps/00:_{"a":{"b":"c"}}_->_{"a":{"b":"c"}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/01:_{"a":[{"b":"c"}]}_->_{"a":{"b":"c"}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/02:_{"a":[{"b":[{"c":"d"}]}]}_->_{"a":{"b":{"c":"d"}}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/03:_{"a":[{"b":"c"}]}_->_{"a":[{"b":"c"}]}_skip:_[a]
=== RUN   TestPatchSliceOfMaps/04:_{_____"services":_[______{_______"checks":_[________{_________"header":_[__________{"a":"b"}_________]________}_______]______}_____]____}_->_{_____"services":_[______{_______"checks":_[________{_________"header":_{"a":"b"}________}_______]______}_____]____}_skip:_[services_services.checks]
--- PASS: TestPatchSliceOfMaps (0.01s)
    --- PASS: TestPatchSliceOfMaps/00:_{"a":{"b":"c"}}_->_{"a":{"b":"c"}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/01:_{"a":[{"b":"c"}]}_->_{"a":{"b":"c"}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/02:_{"a":[{"b":[{"c":"d"}]}]}_->_{"a":{"b":{"c":"d"}}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/03:_{"a":[{"b":"c"}]}_->_{"a":[{"b":"c"}]}_skip:_[a] (0.00s)
    --- PASS: TestPatchSliceOfMaps/04:_{_____"services":_[______{_______"checks":_[________{_________"header":_[__________{"a":"b"}_________]________}_______]______}_____]____}_->_{_____"services":_[______{_______"checks":_[________{_________"header":_{"a":"b"}________}_______]______}_____]____}_skip:_[services_services.checks] (0.00s)
=== RUN   TestConfigFlagsAndEdgecases
=== RUN   TestConfigFlagsAndEdgecases/-advertise
=== RUN   TestConfigFlagsAndEdgecases/-advertise-wan
=== RUN   TestConfigFlagsAndEdgecases/-advertise_and_-advertise-wan
=== RUN   TestConfigFlagsAndEdgecases/-bind
=== RUN   TestConfigFlagsAndEdgecases/-bootstrap
=== RUN   TestConfigFlagsAndEdgecases/-bootstrap-expect
=== RUN   TestConfigFlagsAndEdgecases/-client
=== RUN   TestConfigFlagsAndEdgecases/-config-dir
=== RUN   TestConfigFlagsAndEdgecases/-config-file_json
=== RUN   TestConfigFlagsAndEdgecases/-config-file_hcl_and_json
=== RUN   TestConfigFlagsAndEdgecases/-data-dir_empty
=== RUN   TestConfigFlagsAndEdgecases/-data-dir_non-directory
=== RUN   TestConfigFlagsAndEdgecases/-datacenter
=== RUN   TestConfigFlagsAndEdgecases/-datacenter_empty
=== RUN   TestConfigFlagsAndEdgecases/-dev
=== RUN   TestConfigFlagsAndEdgecases/-disable-host-node-id
=== RUN   TestConfigFlagsAndEdgecases/-disable-keyring-file
=== RUN   TestConfigFlagsAndEdgecases/-dns-port
=== RUN   TestConfigFlagsAndEdgecases/-domain
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_service
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can_be_prefixed_by_non-keywords
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC#01
=== RUN   TestConfigFlagsAndEdgecases/-enable-script-checks
=== RUN   TestConfigFlagsAndEdgecases/-encrypt
=== RUN   TestConfigFlagsAndEdgecases/-config-format_disabled,_skip_unknown_files
=== RUN   TestConfigFlagsAndEdgecases/-config-format=json
=== RUN   TestConfigFlagsAndEdgecases/-config-format=hcl
=== RUN   TestConfigFlagsAndEdgecases/-config-format_invalid
=== RUN   TestConfigFlagsAndEdgecases/-http-port
=== RUN   TestConfigFlagsAndEdgecases/-join
=== RUN   TestConfigFlagsAndEdgecases/-join-wan
=== RUN   TestConfigFlagsAndEdgecases/-log-level
=== RUN   TestConfigFlagsAndEdgecases/-node
=== RUN   TestConfigFlagsAndEdgecases/-node-id
=== RUN   TestConfigFlagsAndEdgecases/-node-meta
=== RUN   TestConfigFlagsAndEdgecases/-non-voting-server
=== RUN   TestConfigFlagsAndEdgecases/-pid-file
=== RUN   TestConfigFlagsAndEdgecases/-protocol
=== RUN   TestConfigFlagsAndEdgecases/-raft-protocol
=== RUN   TestConfigFlagsAndEdgecases/-recursor
=== RUN   TestConfigFlagsAndEdgecases/-rejoin
=== RUN   TestConfigFlagsAndEdgecases/-retry-interval
=== RUN   TestConfigFlagsAndEdgecases/-retry-interval-wan
=== RUN   TestConfigFlagsAndEdgecases/-retry-join
=== RUN   TestConfigFlagsAndEdgecases/-retry-join-wan
=== RUN   TestConfigFlagsAndEdgecases/-retry-max
=== RUN   TestConfigFlagsAndEdgecases/-retry-max-wan
=== RUN   TestConfigFlagsAndEdgecases/-serf-lan-bind
=== RUN   TestConfigFlagsAndEdgecases/-serf-lan-port
=== RUN   TestConfigFlagsAndEdgecases/-serf-wan-bind
=== RUN   TestConfigFlagsAndEdgecases/-serf-wan-port
=== RUN   TestConfigFlagsAndEdgecases/-server
=== RUN   TestConfigFlagsAndEdgecases/-server-port
=== RUN   TestConfigFlagsAndEdgecases/-syslog
=== RUN   TestConfigFlagsAndEdgecases/-ui
=== RUN   TestConfigFlagsAndEdgecases/-ui-dir
=== RUN   TestConfigFlagsAndEdgecases/-ui-content-path
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v4
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v6
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_and_advertise_set_should_not_detect
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_and_advertise_set_should_not_detect
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_>_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_>_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:client_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:client,_address_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client,_address_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_lan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_wan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/json:allow_disabling_serf_wan_port
=== RUN   TestConfigFlagsAndEdgecases/hcl:allow_disabling_serf_wan_port
=== RUN   TestConfigFlagsAndEdgecases/json:serf_bind_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:serf_bind_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/json:serf_bind_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:serf_bind_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/json:dns_recursor_templates_with_deduplication
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_recursor_templates_with_deduplication
=== RUN   TestConfigFlagsAndEdgecases/json:start_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:start_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:start_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:start_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:retry_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:retry_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:retry_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:retry_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:precedence:_merge_order
=== RUN   TestConfigFlagsAndEdgecases/hcl:precedence:_merge_order
=== RUN   TestConfigFlagsAndEdgecases/json:precedence:_flag_before_file
=== RUN   TestConfigFlagsAndEdgecases/hcl:precedence:_flag_before_file
=== RUN   TestConfigFlagsAndEdgecases/json:raft_performance_scaling
=== RUN   TestConfigFlagsAndEdgecases/hcl:raft_performance_scaling
=== RUN   TestConfigFlagsAndEdgecases/json:invalid_input
=== RUN   TestConfigFlagsAndEdgecases/hcl:invalid_input
=== RUN   TestConfigFlagsAndEdgecases/json:datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/hcl:datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/json:acl_datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/json:acl_replication_token_enables_acl_replication
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_replication_token_enables_acl_replication
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v6
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v6
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v6
=== RUN   TestConfigFlagsAndEdgecases/ae_interval_invalid_==_0
=== RUN   TestConfigFlagsAndEdgecases/ae_interval_invalid_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:acl_datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:autopilot.max_trailing_logs_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:autopilot.max_trailing_logs_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_does_not_allow_multiple_addresses
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_does_not_allow_multiple_addresses
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_a_unix_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_a_unix_socket
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap_without_server
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap_without_server
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_without_server
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_without_server
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_dev_mode
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_dev_mode
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect=1_equals_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=1_equals_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect=2_warning
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=2_warning
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_>_2_but_even_warning
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_>_2_but_even_warning
=== RUN   TestConfigFlagsAndEdgecases/json:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly
=== RUN   TestConfigFlagsAndEdgecases/json:client_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/json:datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:dns_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/json:ui_and_ui_dir
=== RUN   TestConfigFlagsAndEdgecases/hcl:ui_and_ui_dir
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_addr_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_addr_any
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_addr_wan_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_addr_wan_any
=== RUN   TestConfigFlagsAndEdgecases/json:recursors_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:recursors_any
=== RUN   TestConfigFlagsAndEdgecases/json:dns_config.udp_answer_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_config.udp_answer_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:dns_config.a_record_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_config.a_record_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_>_10
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_>_10
=== RUN   TestConfigFlagsAndEdgecases/node_name_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_key_too_long
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_key_too_long
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_value_too_long
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_value_too_long
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_too_many_keys
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_too_many_keys
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_http
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_http
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_https
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_https
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_http_vs_https
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_http_vs_https
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_HTTP_vs_RPC
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_HTTP_vs_RPC
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_LAN
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_LAN
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_WAN
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_WAN
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_ID
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_ID
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_nested_sidecar
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_nested_sidecar
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_managed_proxy
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_managed_proxy
=== RUN   TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_must_start_with_+_or_-
=== RUN   TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_must_start_with_+_or_-
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_has_invalid_key
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_has_invalid_key
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_given_but_LAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_LAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_given_but_WAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_WAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/json:multiple_check_files
=== RUN   TestConfigFlagsAndEdgecases/hcl:multiple_check_files
=== RUN   TestConfigFlagsAndEdgecases/json:grpc_check
=== RUN   TestConfigFlagsAndEdgecases/hcl:grpc_check
=== RUN   TestConfigFlagsAndEdgecases/json:alias_check_with_no_node
=== RUN   TestConfigFlagsAndEdgecases/hcl:alias_check_with_no_node
=== RUN   TestConfigFlagsAndEdgecases/json:multiple_service_files
=== RUN   TestConfigFlagsAndEdgecases/hcl:multiple_service_files
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_key
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_key
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_value
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_value
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_many_meta
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_many_meta
=== RUN   TestConfigFlagsAndEdgecases/json:translated_keys
=== RUN   TestConfigFlagsAndEdgecases/hcl:translated_keys
=== RUN   TestConfigFlagsAndEdgecases/json:ignore_snapshot_agent_sub-object
=== RUN   TestConfigFlagsAndEdgecases/hcl:ignore_snapshot_agent_sub-object
=== RUN   TestConfigFlagsAndEdgecases/json:Service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/hcl:Service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/json:Multiple_service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/hcl:Multiple_service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_root
=== RUN   TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_root
=== RUN   TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_api_registration
=== RUN   TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_api_registration
=== RUN   TestConfigFlagsAndEdgecases/json:service.connectsidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/hcl:service.connectsidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/json:services.connect.sidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/hcl:services.connect.sidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/json:verify_server_hostname_implies_verify_outgoing
=== RUN   TestConfigFlagsAndEdgecases/hcl:verify_server_hostname_implies_verify_outgoing
=== RUN   TestConfigFlagsAndEdgecases/json:test_connect_vault_provider_configuration
=== RUN   TestConfigFlagsAndEdgecases/hcl:test_connect_vault_provider_configuration
=== RUN   TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_doesn't_parse
=== RUN   TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_doesn't_parse
=== RUN   TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_unknown_kind
=== RUN   TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_unknown_kind
=== RUN   TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_invalid
--- PASS: TestConfigFlagsAndEdgecases (32.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise-wan (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise_and_-advertise-wan (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-bind (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-bootstrap (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-bootstrap-expect (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-client (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-dir (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-file_json (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-file_hcl_and_json (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-data-dir_empty (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/-data-dir_non-directory (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-datacenter (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-datacenter_empty (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-dev (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-disable-host-node-id (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/-disable-keyring-file (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/-dns-port (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/-domain (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_service (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can_be_prefixed_by_non-keywords (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC#01 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-enable-script-checks (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-encrypt (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format_disabled,_skip_unknown_files (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format=json (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format=hcl (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format_invalid (0.00s)
    --- PASS: TestConfigFlagsAndEdgecases/-http-port (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-join (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-join-wan (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-log-level (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-node (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-node-id (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-node-meta (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-non-voting-server (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/-pid-file (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-protocol (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-raft-protocol (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-recursor (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-rejoin (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-interval (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-interval-wan (0.28s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-join (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-join-wan (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-max (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-max-wan (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-lan-bind (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-lan-port (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-wan-bind (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-wan-port (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-server (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-server-port (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-syslog (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui-dir (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui-content-path (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_v4 (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v4 (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_v6 (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v6 (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_and_advertise_set_should_not_detect (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_and_advertise_set_should_not_detect (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_==_0 (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_==_0 (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_<_0 (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_<_0 (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_>_0 (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_>_0 (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_==_0 (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_==_0 (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_<_0 (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_<_0 (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports (0.26s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_template_and_ports (0.31s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_template_and_ports (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client,_address_template_and_ports (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client,_address_template_and_ports (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_lan_template (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_template (0.32s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_wan_template (0.27s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_template (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_lan_with_ports (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_with_ports (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_wan_with_ports (0.29s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_with_ports (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/json:allow_disabling_serf_wan_port (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:allow_disabling_serf_wan_port (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:serf_bind_address_lan_template (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:serf_bind_address_lan_template (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:serf_bind_address_wan_template (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:serf_bind_address_wan_template (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_recursor_templates_with_deduplication (0.31s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_recursor_templates_with_deduplication (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:start_join_address_template (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:start_join_address_template (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:start_join_wan_address_template (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:start_join_wan_address_template (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:retry_join_address_template (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:retry_join_address_template (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:retry_join_wan_address_template (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:retry_join_wan_address_template (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:precedence:_merge_order (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:precedence:_merge_order (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:precedence:_flag_before_file (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:precedence:_flag_before_file (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:raft_performance_scaling (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:raft_performance_scaling (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:invalid_input (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:invalid_input (0.02s)
    --- PASS: TestConfigFlagsAndEdgecases/json:datacenter_is_lower-cased (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:datacenter_is_lower-cased (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_datacenter_is_lower-cased (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_datacenter_is_lower-cased (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_replication_token_enables_acl_replication (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_replication_token_enables_acl_replication (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v4 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v4 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v4 (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v4 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v4 (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v4 (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v6 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v6 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v6 (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v6 (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v6 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v6 (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/ae_interval_invalid_==_0 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/ae_interval_invalid_<_0 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_datacenter_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_datacenter_invalid (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:autopilot.max_trailing_logs_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:autopilot.max_trailing_logs_invalid (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_empty (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_empty (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_does_not_allow_multiple_addresses (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_does_not_allow_multiple_addresses (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_a_unix_socket (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_a_unix_socket (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap_without_server (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap_without_server (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_without_server (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_without_server (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_invalid (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_dev_mode (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_dev_mode (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_bootstrap (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_bootstrap (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect=1_equals_bootstrap (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=1_equals_bootstrap (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect=2_warning (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=2_warning (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_>_2_but_even_warning (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_>_2_but_even_warning (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_does_not_allow_socket (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_does_not_allow_socket (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:datacenter_invalid (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:datacenter_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_does_not_allow_socket (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_does_not_allow_socket (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ui_and_ui_dir (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ui_and_ui_dir (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_addr_any (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_addr_any (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_addr_wan_any (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_addr_wan_any (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:recursors_any (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:recursors_any (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_config.udp_answer_limit_invalid (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_config.udp_answer_limit_invalid (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_config.a_record_limit_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_config.a_record_limit_invalid (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_<_0 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_<_0 (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_==_0 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_==_0 (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_>_10 (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_>_10 (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/node_name_invalid (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_key_too_long (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_key_too_long (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_value_too_long (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_value_too_long (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_too_many_keys (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_too_many_keys (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_http (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_http (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_https (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_https (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_http_vs_https (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_http_vs_https (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_HTTP_vs_RPC (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_HTTP_vs_RPC (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_LAN (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_LAN (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_WAN (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_WAN (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_ID (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_ID (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_nested_sidecar (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_nested_sidecar (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_managed_proxy (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_managed_proxy (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_cannot_be_empty (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_cannot_be_empty (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_must_start_with_+_or_- (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_must_start_with_+_or_- (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_has_invalid_key (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_has_invalid_key (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_given_but_LAN_keyring_exists (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_LAN_keyring_exists (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_given_but_WAN_keyring_exists (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_WAN_keyring_exists (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:multiple_check_files (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:multiple_check_files (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:grpc_check (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:grpc_check (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:alias_check_with_no_node (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:alias_check_with_no_node (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:multiple_service_files (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:multiple_service_files (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_key (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_key (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_value (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_value (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_many_meta (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_many_meta (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:translated_keys (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:translated_keys (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ignore_snapshot_agent_sub-object (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ignore_snapshot_agent_sub-object (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:Service_managed_proxy_'upstreams' (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:Service_managed_proxy_'upstreams' (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/json:Multiple_service_managed_proxy_'upstreams' (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:Multiple_service_managed_proxy_'upstreams' (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_root (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_root (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_api_registration (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_api_registration (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service.connectsidecar_service_with_checks_and_upstreams (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service.connectsidecar_service_with_checks_and_upstreams (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:services.connect.sidecar_service_with_checks_and_upstreams (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:services.connect.sidecar_service_with_checks_and_upstreams (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:verify_server_hostname_implies_verify_outgoing (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:verify_server_hostname_implies_verify_outgoing (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:test_connect_vault_provider_configuration (0.26s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:test_connect_vault_provider_configuration (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_doesn't_parse (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_doesn't_parse (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_unknown_kind (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_unknown_kind (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_invalid (0.05s)
=== RUN   TestFullConfig
=== RUN   TestFullConfig/json
=== RUN   TestFullConfig/hcl
--- PASS: TestFullConfig (0.53s)
    runtime_test.go:4873: "RuntimeConfig.ACLEnableKeyListPolicy" is zero value
    --- PASS: TestFullConfig/json (0.23s)
    --- PASS: TestFullConfig/hcl (0.31s)
=== RUN   TestNonZero
=== RUN   TestNonZero/nil
=== RUN   TestNonZero/zero_bool
=== RUN   TestNonZero/zero_string
=== RUN   TestNonZero/zero_int
=== RUN   TestNonZero/zero_int8
=== RUN   TestNonZero/zero_int16
=== RUN   TestNonZero/zero_int32
=== RUN   TestNonZero/zero_int64
=== RUN   TestNonZero/zero_uint
=== RUN   TestNonZero/zero_uint8
=== RUN   TestNonZero/zero_uint16
=== RUN   TestNonZero/zero_uint32
=== RUN   TestNonZero/zero_uint64
=== RUN   TestNonZero/zero_float32
=== RUN   TestNonZero/zero_float64
=== RUN   TestNonZero/ptr_to_zero_value
=== RUN   TestNonZero/empty_slice
=== RUN   TestNonZero/slice_with_zero_value
=== RUN   TestNonZero/empty_map
=== RUN   TestNonZero/map_with_zero_value_key
=== RUN   TestNonZero/map_with_zero_value_elem
=== RUN   TestNonZero/struct_with_nil_field
=== RUN   TestNonZero/struct_with_zero_value_field
=== RUN   TestNonZero/struct_with_empty_array
--- PASS: TestNonZero (0.01s)
    --- PASS: TestNonZero/nil (0.00s)
    --- PASS: TestNonZero/zero_bool (0.00s)
    --- PASS: TestNonZero/zero_string (0.00s)
    --- PASS: TestNonZero/zero_int (0.00s)
    --- PASS: TestNonZero/zero_int8 (0.00s)
    --- PASS: TestNonZero/zero_int16 (0.00s)
    --- PASS: TestNonZero/zero_int32 (0.00s)
    --- PASS: TestNonZero/zero_int64 (0.00s)
    --- PASS: TestNonZero/zero_uint (0.00s)
    --- PASS: TestNonZero/zero_uint8 (0.00s)
    --- PASS: TestNonZero/zero_uint16 (0.00s)
    --- PASS: TestNonZero/zero_uint32 (0.00s)
    --- PASS: TestNonZero/zero_uint64 (0.00s)
    --- PASS: TestNonZero/zero_float32 (0.00s)
    --- PASS: TestNonZero/zero_float64 (0.00s)
    --- PASS: TestNonZero/ptr_to_zero_value (0.00s)
    --- PASS: TestNonZero/empty_slice (0.00s)
    --- PASS: TestNonZero/slice_with_zero_value (0.00s)
    --- PASS: TestNonZero/empty_map (0.00s)
    --- PASS: TestNonZero/map_with_zero_value_key (0.00s)
    --- PASS: TestNonZero/map_with_zero_value_elem (0.00s)
    --- PASS: TestNonZero/struct_with_nil_field (0.00s)
    --- PASS: TestNonZero/struct_with_zero_value_field (0.00s)
    --- PASS: TestNonZero/struct_with_empty_array (0.00s)
=== RUN   TestConfigDecodeBytes
=== PAUSE TestConfigDecodeBytes
=== RUN   TestSanitize
--- PASS: TestSanitize (0.02s)
=== RUN   TestRuntime_apiAddresses
--- PASS: TestRuntime_apiAddresses (0.00s)
=== RUN   TestRuntime_APIConfigHTTPS
--- PASS: TestRuntime_APIConfigHTTPS (0.00s)
=== RUN   TestRuntime_APIConfigHTTP
--- PASS: TestRuntime_APIConfigHTTP (0.00s)
=== RUN   TestRuntime_APIConfigUNIX
--- PASS: TestRuntime_APIConfigUNIX (0.00s)
=== RUN   TestRuntime_APIConfigANYAddrV4
--- PASS: TestRuntime_APIConfigANYAddrV4 (0.00s)
=== RUN   TestRuntime_APIConfigANYAddrV6
--- PASS: TestRuntime_APIConfigANYAddrV6 (0.00s)
=== RUN   TestRuntime_ClientAddress
--- PASS: TestRuntime_ClientAddress (0.00s)
=== RUN   TestRuntime_ClientAddressAnyV4
--- PASS: TestRuntime_ClientAddressAnyV4 (0.00s)
=== RUN   TestRuntime_ClientAddressAnyV6
--- PASS: TestRuntime_ClientAddressAnyV6 (0.00s)
=== RUN   TestRuntime_ToTLSUtilConfig
--- PASS: TestRuntime_ToTLSUtilConfig (0.00s)
=== RUN   TestReadPath
=== RUN   TestReadPath/dir_skip_non_json_or_hcl_if_config-format_not_set
=== RUN   TestReadPath/dir_read_non_json_or_hcl_if_config-format_set
=== RUN   TestReadPath/file_skip_non_json_or_hcl_if_config-format_not_set
=== RUN   TestReadPath/file_read_non_json_or_hcl_if_config-format_set
--- PASS: TestReadPath (0.05s)
    --- PASS: TestReadPath/dir_skip_non_json_or_hcl_if_config-format_not_set (0.00s)
    --- PASS: TestReadPath/dir_read_non_json_or_hcl_if_config-format_set (0.01s)
    --- PASS: TestReadPath/file_skip_non_json_or_hcl_if_config-format_not_set (0.01s)
    --- PASS: TestReadPath/file_read_non_json_or_hcl_if_config-format_set (0.00s)
=== RUN   Test_UIPathBuilder
--- PASS: Test_UIPathBuilder (0.00s)
=== RUN   TestSegments
=== RUN   TestSegments/json:segment_name_not_in_OSS
=== RUN   TestSegments/hcl:segment_name_not_in_OSS
=== RUN   TestSegments/json:segment_port_must_be_set
=== RUN   TestSegments/hcl:segment_port_must_be_set
=== RUN   TestSegments/json:segments_not_in_OSS
=== RUN   TestSegments/hcl:segments_not_in_OSS
--- PASS: TestSegments (0.40s)
    --- PASS: TestSegments/json:segment_name_not_in_OSS (0.08s)
    --- PASS: TestSegments/hcl:segment_name_not_in_OSS (0.08s)
    --- PASS: TestSegments/json:segment_port_must_be_set (0.08s)
    --- PASS: TestSegments/hcl:segment_port_must_be_set (0.07s)
    --- PASS: TestSegments/json:segments_not_in_OSS (0.04s)
    --- PASS: TestSegments/hcl:segments_not_in_OSS (0.04s)
=== CONT  TestConfigDecodeBytes
--- PASS: TestConfigDecodeBytes (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/config	33.705s
=== RUN   TestCollectHostInfo
--- PASS: TestCollectHostInfo (0.15s)
PASS
ok  	github.com/hashicorp/consul/agent/debug	0.240s
?   	github.com/hashicorp/consul/agent/exec	[no test files]
=== RUN   TestAgentAntiEntropy_Services
--- SKIP: TestAgentAntiEntropy_Services (0.00s)
    state_test.go:30: DM-skipped
=== RUN   TestAgentAntiEntropy_Services_ConnectProxy
=== PAUSE TestAgentAntiEntropy_Services_ConnectProxy
=== RUN   TestAgent_ServiceWatchCh
=== PAUSE TestAgent_ServiceWatchCh
=== RUN   TestAgentAntiEntropy_EnableTagOverride
--- SKIP: TestAgentAntiEntropy_EnableTagOverride (0.00s)
    state_test.go:507: DM-skipped
=== RUN   TestAgentAntiEntropy_Services_WithChecks
=== PAUSE TestAgentAntiEntropy_Services_WithChecks
=== RUN   TestAgentAntiEntropy_Services_ACLDeny
=== PAUSE TestAgentAntiEntropy_Services_ACLDeny
=== RUN   TestAgentAntiEntropy_Checks
=== PAUSE TestAgentAntiEntropy_Checks
=== RUN   TestAgentAntiEntropy_Checks_ACLDeny
=== PAUSE TestAgentAntiEntropy_Checks_ACLDeny
=== RUN   TestAgent_UpdateCheck_DiscardOutput
--- SKIP: TestAgent_UpdateCheck_DiscardOutput (0.00s)
    state_test.go:1336: DM-skipped
=== RUN   TestAgentAntiEntropy_Check_DeferSync
--- SKIP: TestAgentAntiEntropy_Check_DeferSync (0.00s)
    state_test.go:1388: DM-skipped
=== RUN   TestAgentAntiEntropy_NodeInfo
=== PAUSE TestAgentAntiEntropy_NodeInfo
=== RUN   TestAgent_ServiceTokens
=== PAUSE TestAgent_ServiceTokens
=== RUN   TestAgent_CheckTokens
=== PAUSE TestAgent_CheckTokens
=== RUN   TestAgent_CheckCriticalTime
--- SKIP: TestAgent_CheckCriticalTime (0.00s)
    state_test.go:1707: DM-skipped
=== RUN   TestAgent_AddCheckFailure
=== PAUSE TestAgent_AddCheckFailure
=== RUN   TestAgent_AliasCheck
=== PAUSE TestAgent_AliasCheck
=== RUN   TestAgent_sendCoordinate
=== PAUSE TestAgent_sendCoordinate
=== RUN   TestState_Notify
=== PAUSE TestState_Notify
=== RUN   TestStateProxyManagement
=== PAUSE TestStateProxyManagement
=== RUN   TestStateProxyRestore
=== PAUSE TestStateProxyRestore
=== RUN   TestAliasNotifications_local
=== PAUSE TestAliasNotifications_local
=== CONT  TestAgentAntiEntropy_Services_ConnectProxy
=== CONT  TestAgent_AddCheckFailure
=== CONT  TestAliasNotifications_local
=== CONT  TestState_Notify
--- PASS: TestState_Notify (0.00s)
=== CONT  TestStateProxyRestore
--- PASS: TestStateProxyRestore (0.01s)
=== CONT  TestAgent_sendCoordinate
WARNING: bootstrap = true: do not enable unless necessary
TestAliasNotifications_local - 2019/12/30 18:51:06.040150 [WARN] agent: Node name "Node 2b2c2017-632e-163e-266a-e2db1e1c73b5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAliasNotifications_local - 2019/12/30 18:51:06.041056 [DEBUG] tlsutil: Update with version 1
TestAliasNotifications_local - 2019/12/30 18:51:06.047508 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:06.048299 [WARN] agent: Node name "Node ac493f99-78bd-7a1a-0f12-b47789ad7652" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:06.048712 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:06.051064 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_sendCoordinate - 2019/12/30 18:51:06.058951 [WARN] agent: Node name "Node e8722470-b066-f6f2-e32f-4f0edeb0c041" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
=== CONT  TestAgent_AliasCheck
TestAgent_sendCoordinate - 2019/12/30 18:51:06.061335 [DEBUG] tlsutil: Update with version 1
--- PASS: TestAgent_AddCheckFailure (0.22s)
TestAgent_sendCoordinate - 2019/12/30 18:51:06.070737 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
--- PASS: TestAgent_AliasCheck (0.11s)
=== CONT  TestStateProxyManagement
--- PASS: TestStateProxyManagement (0.00s)
=== CONT  TestAgentAntiEntropy_Checks_ACLDeny
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:06.282044 [WARN] agent: Node name "Node f6ee21bf-db14-af80-0ca3-8152a5808642" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:06.282589 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:06.284998 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2b2c2017-632e-163e-266a-e2db1e1c73b5 Address:127.0.0.1:50512}]
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50512 [Follower] entering Follower state (Leader: "")
2019/12/30 18:51:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e8722470-b066-f6f2-e32f-4f0edeb0c041 Address:127.0.0.1:50518}]
2019/12/30 18:51:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ac493f99-78bd-7a1a-0f12-b47789ad7652 Address:127.0.0.1:50506}]
TestAgent_sendCoordinate - 2019/12/30 18:51:07.064263 [INFO] serf: EventMemberJoin: Node e8722470-b066-f6f2-e32f-4f0edeb0c041.dc1 127.0.0.1
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50518 [Follower] entering Follower state (Leader: "")
TestAliasNotifications_local - 2019/12/30 18:51:07.072625 [INFO] serf: EventMemberJoin: Node 2b2c2017-632e-163e-266a-e2db1e1c73b5.dc1 127.0.0.1
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50506 [Follower] entering Follower state (Leader: "")
TestAliasNotifications_local - 2019/12/30 18:51:07.080523 [INFO] serf: EventMemberJoin: Node 2b2c2017-632e-163e-266a-e2db1e1c73b5 127.0.0.1
TestAgent_sendCoordinate - 2019/12/30 18:51:07.072656 [INFO] serf: EventMemberJoin: Node e8722470-b066-f6f2-e32f-4f0edeb0c041 127.0.0.1
TestAgent_sendCoordinate - 2019/12/30 18:51:07.083885 [INFO] agent: Started DNS server 127.0.0.1:50513 (tcp)
TestAgent_sendCoordinate - 2019/12/30 18:51:07.083968 [INFO] agent: Started DNS server 127.0.0.1:50513 (udp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.087803 [INFO] serf: EventMemberJoin: Node ac493f99-78bd-7a1a-0f12-b47789ad7652.dc1 127.0.0.1
TestAgent_sendCoordinate - 2019/12/30 18:51:07.089921 [INFO] consul: Handled member-join event for server "Node e8722470-b066-f6f2-e32f-4f0edeb0c041.dc1" in area "wan"
TestAliasNotifications_local - 2019/12/30 18:51:07.090659 [INFO] consul: Adding LAN server Node 2b2c2017-632e-163e-266a-e2db1e1c73b5 (Addr: tcp/127.0.0.1:50512) (DC: dc1)
TestAliasNotifications_local - 2019/12/30 18:51:07.090941 [INFO] consul: Handled member-join event for server "Node 2b2c2017-632e-163e-266a-e2db1e1c73b5.dc1" in area "wan"
TestAliasNotifications_local - 2019/12/30 18:51:07.106715 [INFO] agent: Started DNS server 127.0.0.1:50507 (tcp)
TestAgent_sendCoordinate - 2019/12/30 18:51:07.108094 [INFO] agent: Started HTTP server on 127.0.0.1:50514 (tcp)
TestAgent_sendCoordinate - 2019/12/30 18:51:07.108235 [INFO] agent: started state syncer
TestAgent_sendCoordinate - 2019/12/30 18:51:07.110604 [INFO] consul: Adding LAN server Node e8722470-b066-f6f2-e32f-4f0edeb0c041 (Addr: tcp/127.0.0.1:50518) (DC: dc1)
2019/12/30 18:51:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50518 [Candidate] entering Candidate state in term 2
TestAliasNotifications_local - 2019/12/30 18:51:07.116152 [INFO] agent: Started DNS server 127.0.0.1:50507 (udp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.117415 [INFO] serf: EventMemberJoin: Node ac493f99-78bd-7a1a-0f12-b47789ad7652 127.0.0.1
TestAliasNotifications_local - 2019/12/30 18:51:07.118469 [INFO] agent: Started HTTP server on 127.0.0.1:50508 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.118531 [INFO] agent: Started DNS server 127.0.0.1:50501 (udp)
TestAliasNotifications_local - 2019/12/30 18:51:07.118538 [INFO] agent: started state syncer
2019/12/30 18:51:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50512 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.119013 [INFO] consul: Handled member-join event for server "Node ac493f99-78bd-7a1a-0f12-b47789ad7652.dc1" in area "wan"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.119699 [INFO] consul: Adding LAN server Node ac493f99-78bd-7a1a-0f12-b47789ad7652 (Addr: tcp/127.0.0.1:50506) (DC: dc1)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.119731 [INFO] agent: Started DNS server 127.0.0.1:50501 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.122366 [INFO] agent: Started HTTP server on 127.0.0.1:50502 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.122452 [INFO] agent: started state syncer
2019/12/30 18:51:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50506 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f6ee21bf-db14-af80-0ca3-8152a5808642 Address:127.0.0.1:50524}]
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50524 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.219475 [INFO] serf: EventMemberJoin: Node f6ee21bf-db14-af80-0ca3-8152a5808642.dc1 127.0.0.1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.225161 [INFO] serf: EventMemberJoin: Node f6ee21bf-db14-af80-0ca3-8152a5808642 127.0.0.1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.226520 [INFO] consul: Adding LAN server Node f6ee21bf-db14-af80-0ca3-8152a5808642 (Addr: tcp/127.0.0.1:50524) (DC: dc1)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.227407 [INFO] consul: Handled member-join event for server "Node f6ee21bf-db14-af80-0ca3-8152a5808642.dc1" in area "wan"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.229480 [INFO] agent: Started DNS server 127.0.0.1:50519 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.229876 [INFO] agent: Started DNS server 127.0.0.1:50519 (udp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.232630 [INFO] agent: Started HTTP server on 127.0.0.1:50520 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:07.232780 [INFO] agent: started state syncer
2019/12/30 18:51:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50524 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50506 [Leader] entering Leader state
2019/12/30 18:51:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50512 [Leader] entering Leader state
TestAliasNotifications_local - 2019/12/30 18:51:07.680312 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.680312 [INFO] consul: cluster leadership acquired
2019/12/30 18:51:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:07 [INFO]  raft: Node at 127.0.0.1:50518 [Leader] entering Leader state
TestAliasNotifications_local - 2019/12/30 18:51:07.681144 [INFO] consul: New leader elected: Node 2b2c2017-632e-163e-266a-e2db1e1c73b5
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:07.681498 [INFO] consul: New leader elected: Node ac493f99-78bd-7a1a-0f12-b47789ad7652
TestAgent_sendCoordinate - 2019/12/30 18:51:07.682025 [INFO] consul: cluster leadership acquired
TestAgent_sendCoordinate - 2019/12/30 18:51:07.682526 [INFO] consul: New leader elected: Node e8722470-b066-f6f2-e32f-4f0edeb0c041
2019/12/30 18:51:08 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:08 [INFO]  raft: Node at 127.0.0.1:50524 [Leader] entering Leader state
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.103486 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.103978 [INFO] consul: New leader elected: Node f6ee21bf-db14-af80-0ca3-8152a5808642
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.128599 [ERR] agent: failed to sync remote state: ACL not found
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:08.320280 [INFO] agent: Synced node info
TestAgent_sendCoordinate - 2019/12/30 18:51:08.320491 [INFO] agent: Synced node info
TestAliasNotifications_local - 2019/12/30 18:51:08.321776 [INFO] agent: Synced node info
TestAliasNotifications_local - 2019/12/30 18:51:08.321872 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.511339 [INFO] acl: initializing acls
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.719981 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.720073 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.778925 [INFO] acl: initializing acls
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.779208 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:08.895734 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgent_sendCoordinate - 2019/12/30 18:51:08.992862 [INFO] agent: Requesting shutdown
TestAgent_sendCoordinate - 2019/12/30 18:51:08.992969 [INFO] consul: shutting down server
TestAgent_sendCoordinate - 2019/12/30 18:51:08.993018 [WARN] serf: Shutdown without a Leave
TestAgent_sendCoordinate - 2019/12/30 18:51:09.136056 [WARN] serf: Shutdown without a Leave
TestAgent_sendCoordinate - 2019/12/30 18:51:09.241729 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestAgent_sendCoordinate - 2019/12/30 18:51:09.241904 [INFO] manager: shutting down
TestAgent_sendCoordinate - 2019/12/30 18:51:09.242543 [INFO] agent: consul server down
TestAgent_sendCoordinate - 2019/12/30 18:51:09.242606 [INFO] agent: shutdown complete
TestAgent_sendCoordinate - 2019/12/30 18:51:09.242713 [INFO] agent: Stopping DNS server 127.0.0.1:50513 (tcp)
TestAgent_sendCoordinate - 2019/12/30 18:51:09.242874 [INFO] agent: Stopping DNS server 127.0.0.1:50513 (udp)
TestAgent_sendCoordinate - 2019/12/30 18:51:09.243041 [INFO] agent: Stopping HTTP server 127.0.0.1:50514 (tcp)
TestAgent_sendCoordinate - 2019/12/30 18:51:09.243319 [INFO] agent: Waiting for endpoints to shut down
TestAgent_sendCoordinate - 2019/12/30 18:51:09.243404 [INFO] agent: Endpoints down
--- PASS: TestAgent_sendCoordinate (3.39s)
    state_test.go:1859: 10 1 100ms
=== CONT  TestAgent_CheckTokens
=== CONT  TestAgent_ServiceTokens
--- PASS: TestAgent_CheckTokens (0.07s)
=== CONT  TestAgentAntiEntropy_NodeInfo
--- PASS: TestAgent_ServiceTokens (0.04s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:09.435761 [WARN] agent: Node name "Node 94531a53-adbc-a71c-2a8f-30a51a29f757" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.437584 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.437626 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.440057 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.440669 [INFO] serf: EventMemberUpdate: Node f6ee21bf-db14-af80-0ca3-8152a5808642
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.441311 [INFO] serf: EventMemberUpdate: Node f6ee21bf-db14-af80-0ca3-8152a5808642.dc1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.441673 [INFO] serf: EventMemberUpdate: Node f6ee21bf-db14-af80-0ca3-8152a5808642
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.442282 [INFO] serf: EventMemberUpdate: Node f6ee21bf-db14-af80-0ca3-8152a5808642.dc1
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:09.445634 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:09.450879 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAliasNotifications_local - 2019/12/30 18:51:09.483118 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.758894 [WARN] agent: Node info update blocked by ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:09.759011 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:09.905559 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAliasNotifications_local - 2019/12/30 18:51:09.906131 [DEBUG] consul: Skipping self join check for "Node 2b2c2017-632e-163e-266a-e2db1e1c73b5" since the cluster is too small
TestAliasNotifications_local - 2019/12/30 18:51:09.906320 [INFO] consul: member 'Node 2b2c2017-632e-163e-266a-e2db1e1c73b5' joined, marking health alive
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:09.905577 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:09.908664 [DEBUG] consul: Skipping self join check for "Node ac493f99-78bd-7a1a-0f12-b47789ad7652" since the cluster is too small
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:09.908840 [INFO] consul: member 'Node ac493f99-78bd-7a1a-0f12-b47789ad7652' joined, marking health alive
TestAliasNotifications_local - 2019/12/30 18:51:10.728075 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAliasNotifications_local - 2019/12/30 18:51:10.729808 [INFO] agent: Synced service "socat"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:10.849110 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:10.999758 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAliasNotifications_local - 2019/12/30 18:51:11.024148 [INFO] agent: Synced service "socat-sidecar-proxy"
TestAliasNotifications_local - 2019/12/30 18:51:11.024256 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.024303 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.024337 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.024702 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.024769 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.024828 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.024887 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.024930 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.026997 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.027083 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.027140 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.027188 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
2019/12/30 18:51:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:40e4a748-2192-161a-0510-9bf59fe950b5 Address:127.0.0.1:50530}]
2019/12/30 18:51:11 [INFO]  raft: Node at 127.0.0.1:50530 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.182115 [INFO] serf: EventMemberJoin: Node 94531a53-adbc-a71c-2a8f-30a51a29f757.dc1 127.0.0.1
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.191852 [INFO] serf: EventMemberJoin: Node 94531a53-adbc-a71c-2a8f-30a51a29f757 127.0.0.1
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.192836 [INFO] consul: Handled member-join event for server "Node 94531a53-adbc-a71c-2a8f-30a51a29f757.dc1" in area "wan"
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.193146 [INFO] consul: Adding LAN server Node 94531a53-adbc-a71c-2a8f-30a51a29f757 (Addr: tcp/127.0.0.1:50530) (DC: dc1)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.196624 [INFO] agent: Started DNS server 127.0.0.1:50525 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.196776 [INFO] agent: Started DNS server 127.0.0.1:50525 (udp)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.200238 [INFO] agent: Started HTTP server on 127.0.0.1:50526 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:11.200371 [INFO] agent: started state syncer
2019/12/30 18:51:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:11 [INFO]  raft: Node at 127.0.0.1:50530 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:11.330499 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAliasNotifications_local - 2019/12/30 18:51:11.330652 [INFO] agent: Synced check "service:socat-maintenance"
TestAliasNotifications_local - 2019/12/30 18:51:11.330698 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:11.330942 [DEBUG] consul: Skipping self join check for "Node f6ee21bf-db14-af80-0ca3-8152a5808642" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:11.331048 [INFO] consul: member 'Node f6ee21bf-db14-af80-0ca3-8152a5808642' joined, marking health alive
TestAliasNotifications_local - 2019/12/30 18:51:11.331288 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.331354 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.331404 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:11.531331 [INFO] agent: Synced service "redis-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:11.531426 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:11.531465 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:11.531549 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:11.531593 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:11.531625 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.704981 [INFO] agent: Synced check "service:socat-sidecar-proxy:2"
TestAliasNotifications_local - 2019/12/30 18:51:11.705086 [DEBUG] agent: Check "service:socat-maintenance" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.705126 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.705923 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/30 18:51:11.705992 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:11.706850 [DEBUG] consul: Skipping self join check for "Node f6ee21bf-db14-af80-0ca3-8152a5808642" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:11.707407 [DEBUG] consul: Skipping self join check for "Node f6ee21bf-db14-af80-0ca3-8152a5808642" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:11.708274 [DEBUG] consul: dropping node "Node f6ee21bf-db14-af80-0ca3-8152a5808642" from result due to ACLs
TestAliasNotifications_local - 2019/12/30 18:51:12.055700 [INFO] agent: Synced check "service:socat-sidecar-proxy:2"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:12.057200 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:12.059078 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:12.209667 [INFO] agent: Synced service "web-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:12.209756 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:12.209803 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:12.209837 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:12.210384 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:12.210449 [DEBUG] agent: Service "web-proxy" in sync
2019/12/30 18:51:12 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:12 [INFO]  raft: Node at 127.0.0.1:50530 [Leader] entering Leader state
TestAliasNotifications_local - 2019/12/30 18:51:12.390045 [INFO] agent: Deregistered check "service:socat-maintenance"
TestAliasNotifications_local - 2019/12/30 18:51:12.390146 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.390198 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.390382 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.390436 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.390481 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.390546 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.390581 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:12.390722 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:12.391120 [INFO] consul: New leader elected: Node 94531a53-adbc-a71c-2a8f-30a51a29f757
TestAliasNotifications_local - 2019/12/30 18:51:12.391455 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.391512 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/30 18:51:12.734112 [INFO] agent: Synced check "service:socat-tcp"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:12.734102 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:13.120548 [INFO] agent: Synced node info
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.120835 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.120894 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.121620 [DEBUG] agent: Service "mysql" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.121685 [DEBUG] agent: Service "api" in sync
TestAliasNotifications_local - 2019/12/30 18:51:13.126678 [INFO] agent: Synced check "service:socat-sidecar-proxy:2"
TestAliasNotifications_local - 2019/12/30 18:51:13.126762 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:13.126923 [INFO] agent: Requesting shutdown
TestAliasNotifications_local - 2019/12/30 18:51:13.126954 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/30 18:51:13.127038 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/30 18:51:13.127094 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/30 18:51:13.127148 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/30 18:51:13.127180 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/30 18:51:13.127322 [INFO] consul: shutting down server
TestAliasNotifications_local - 2019/12/30 18:51:13.127377 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.130117 [WARN] agent: Check "mysql-check" registration blocked by ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.130207 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.131050 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAliasNotifications_local - 2019/12/30 18:51:13.227844 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:13.231886 [INFO] agent: Synced service "cache-proxy"
TestAliasNotifications_local - 2019/12/30 18:51:13.353914 [INFO] manager: shutting down
TestAliasNotifications_local - 2019/12/30 18:51:13.354783 [INFO] agent: consul server down
TestAliasNotifications_local - 2019/12/30 18:51:13.354844 [INFO] agent: shutdown complete
TestAliasNotifications_local - 2019/12/30 18:51:13.354899 [INFO] agent: Stopping DNS server 127.0.0.1:50507 (tcp)
TestAliasNotifications_local - 2019/12/30 18:51:13.355081 [INFO] agent: Stopping DNS server 127.0.0.1:50507 (udp)
TestAliasNotifications_local - 2019/12/30 18:51:13.355316 [INFO] agent: Stopping HTTP server 127.0.0.1:50508 (tcp)
TestAliasNotifications_local - 2019/12/30 18:51:13.355538 [INFO] agent: Waiting for endpoints to shut down
TestAliasNotifications_local - 2019/12/30 18:51:13.355614 [INFO] agent: Endpoints down
--- PASS: TestAliasNotifications_local (7.52s)
=== CONT  TestAgentAntiEntropy_Services_ACLDeny
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.358740 [INFO] agent: Synced service "mysql"
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:13.518410 [WARN] agent: Node name "Node 54ec4706-15bc-e317-8e39-1e4233613bb8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:13.522320 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:13.524933 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.640122 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:13.640578 [WARN] agent: Check "mysql-check" registration blocked by ACLs
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:13.895798 [INFO] agent: Deregistered service "lb-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:13.895876 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:13.895924 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:13.895957 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:13.897396 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:13.897468 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:14.088257 [INFO] agent: Synced check "api-check"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:14.088342 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:14.089509 [DEBUG] consul: dropping check "api-check" from result due to ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:14.089603 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:14.389218 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.516501 [INFO] agent: Deregistered service "cache-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.516581 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.516622 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.516654 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517414 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517499 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517546 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517417 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517582 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517676 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517724 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517737 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517879 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517919 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.517949 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.794793 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.912300 [INFO] manager: shutting down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:14.912295 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:14.912843 [WARN] agent: Check "mysql-check" registration blocked by ACLs
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:14.915209 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.917386 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.917475 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.917549 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.917698 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (udp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.917896 [INFO] agent: Stopping HTTP server 127.0.0.1:50502 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.918164 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/30 18:51:14.918235 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_ConnectProxy (9.09s)
=== CONT  TestAgentAntiEntropy_Checks
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:14.982476 [WARN] agent: Node name "Node c7e2edea-8955-0f94-daae-282f924bd3e0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:14.982965 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:14.985832 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.195956 [INFO] agent: Deregistered check "api-check"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.196045 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.196546 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.196628 [INFO] consul: shutting down server
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.196680 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.196999 [DEBUG] agent: Service "mysql" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.197060 [DEBUG] agent: Service "api" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.197117 [DEBUG] agent: Check "mysql-check" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.197150 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.197235 [DEBUG] agent: Service "mysql" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.197281 [DEBUG] agent: Service "api" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.197328 [DEBUG] agent: Check "mysql-check" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.197364 [DEBUG] agent: Node info in sync
2019/12/30 18:51:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:54ec4706-15bc-e317-8e39-1e4233613bb8 Address:127.0.0.1:50536}]
2019/12/30 18:51:15 [INFO]  raft: Node at 127.0.0.1:50536 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.206377 [INFO] serf: EventMemberJoin: Node 54ec4706-15bc-e317-8e39-1e4233613bb8.dc1 127.0.0.1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.218759 [INFO] serf: EventMemberJoin: Node 54ec4706-15bc-e317-8e39-1e4233613bb8 127.0.0.1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.219729 [INFO] consul: Adding LAN server Node 54ec4706-15bc-e317-8e39-1e4233613bb8 (Addr: tcp/127.0.0.1:50536) (DC: dc1)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.220075 [INFO] consul: Handled member-join event for server "Node 54ec4706-15bc-e317-8e39-1e4233613bb8.dc1" in area "wan"
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.220280 [INFO] agent: Started DNS server 127.0.0.1:50531 (udp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.220593 [INFO] agent: Started DNS server 127.0.0.1:50531 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.222948 [INFO] agent: Started HTTP server on 127.0.0.1:50532 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:15.223040 [INFO] agent: started state syncer
2019/12/30 18:51:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:15 [INFO]  raft: Node at 127.0.0.1:50536 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.340930 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.488216 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.488743 [INFO] agent: Synced node info
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.488877 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.489026 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.489310 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.489376 [INFO] consul: shutting down server
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.489481 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.490356 [INFO] manager: shutting down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.490915 [INFO] agent: consul server down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.490970 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.491023 [INFO] agent: Stopping DNS server 127.0.0.1:50519 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.491148 [INFO] agent: Stopping DNS server 127.0.0.1:50519 (udp)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.491272 [WARN] consul: error getting server health from "Node 94531a53-adbc-a71c-2a8f-30a51a29f757": rpc error making call: EOF
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.491284 [INFO] agent: Stopping HTTP server 127.0.0.1:50520 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.491480 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/30 18:51:15.491550 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Checks_ACLDeny (9.32s)
=== CONT  TestAgentAntiEntropy_Services_WithChecks
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:15.574584 [WARN] agent: Node name "Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:15.575035 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:15.577361 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:15.795042 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.488424 [WARN] consul: error getting server health from "Node 94531a53-adbc-a71c-2a8f-30a51a29f757": context deadline exceeded
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.737373 [INFO] manager: shutting down
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.737380 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.737819 [INFO] agent: consul server down
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.737863 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.738168 [ERR] consul: failed to reconcile member: {Node 94531a53-adbc-a71c-2a8f-30a51a29f757 127.0.0.1 50528 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:40e4a748-2192-161a-0510-9bf59fe950b5 port:50530 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:50529] alive 1 5 2 2 5 4}: raft is already shutdown
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.737878 [INFO] agent: shutdown complete
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.738500 [INFO] agent: Stopping DNS server 127.0.0.1:50525 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.738687 [INFO] agent: Stopping DNS server 127.0.0.1:50525 (udp)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.738873 [INFO] agent: Stopping HTTP server 127.0.0.1:50526 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.739107 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_NodeInfo - 2019/12/30 18:51:16.739194 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_NodeInfo (7.38s)
=== CONT  TestAgent_ServiceWatchCh
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ServiceWatchCh - 2019/12/30 18:51:16.827189 [WARN] agent: Node name "Node 8b4a513b-053c-c964-538d-ac4870bd82ef" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ServiceWatchCh - 2019/12/30 18:51:16.835363 [DEBUG] tlsutil: Update with version 1
TestAgent_ServiceWatchCh - 2019/12/30 18:51:16.840324 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:16 [INFO]  raft: Node at 127.0.0.1:50536 [Leader] entering Leader state
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:16.971800 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:16.972251 [INFO] consul: New leader elected: Node 54ec4706-15bc-e317-8e39-1e4233613bb8
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:17.109092 [ERR] agent: failed to sync remote state: ACL not found
2019/12/30 18:51:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c7e2edea-8955-0f94-daae-282f924bd3e0 Address:127.0.0.1:50542}]
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.424198 [INFO] serf: EventMemberJoin: Node c7e2edea-8955-0f94-daae-282f924bd3e0.dc1 127.0.0.1
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:50542 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.430906 [INFO] serf: EventMemberJoin: Node c7e2edea-8955-0f94-daae-282f924bd3e0 127.0.0.1
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.431795 [INFO] consul: Handled member-join event for server "Node c7e2edea-8955-0f94-daae-282f924bd3e0.dc1" in area "wan"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.432092 [INFO] consul: Adding LAN server Node c7e2edea-8955-0f94-daae-282f924bd3e0 (Addr: tcp/127.0.0.1:50542) (DC: dc1)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.433628 [INFO] agent: Started DNS server 127.0.0.1:50537 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.434077 [INFO] agent: Started DNS server 127.0.0.1:50537 (udp)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.436565 [INFO] agent: Started HTTP server on 127.0.0.1:50538 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:17.436682 [INFO] agent: started state syncer
2019/12/30 18:51:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:50542 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:17.583321 [INFO] acl: initializing acls
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:17.966112 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:17.966185 [WARN] consul: Configuring a non-UUID master token is deprecated
2019/12/30 18:51:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:74a03506-dfa8-aa29-508a-ce5fc89cd6cb Address:127.0.0.1:50548}]
2019/12/30 18:51:17 [INFO]  raft: Node at 127.0.0.1:50548 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:17.974118 [INFO] serf: EventMemberJoin: Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb.dc1 127.0.0.1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:17.982355 [INFO] serf: EventMemberJoin: Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb 127.0.0.1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.001482 [INFO] consul: Handled member-join event for server "Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb.dc1" in area "wan"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.006342 [INFO] consul: Adding LAN server Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb (Addr: tcp/127.0.0.1:50548) (DC: dc1)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.008318 [INFO] agent: Started DNS server 127.0.0.1:50543 (udp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.008437 [INFO] agent: Started DNS server 127.0.0.1:50543 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.012940 [INFO] agent: Started HTTP server on 127.0.0.1:50544 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.013070 [INFO] agent: started state syncer
2019/12/30 18:51:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:18 [INFO]  raft: Node at 127.0.0.1:50548 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:18.313494 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:18.372066 [INFO] acl: initializing acls
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:18.372224 [WARN] consul: Configuring a non-UUID master token is deprecated
2019/12/30 18:51:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:18 [INFO]  raft: Node at 127.0.0.1:50542 [Leader] entering Leader state
2019/12/30 18:51:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8b4a513b-053c-c964-538d-ac4870bd82ef Address:127.0.0.1:50554}]
2019/12/30 18:51:18 [INFO]  raft: Node at 127.0.0.1:50554 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:18.425653 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:18.426139 [INFO] consul: New leader elected: Node c7e2edea-8955-0f94-daae-282f924bd3e0
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.427918 [INFO] serf: EventMemberJoin: Node 8b4a513b-053c-c964-538d-ac4870bd82ef.dc1 127.0.0.1
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.432898 [INFO] serf: EventMemberJoin: Node 8b4a513b-053c-c964-538d-ac4870bd82ef 127.0.0.1
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.433823 [INFO] consul: Adding LAN server Node 8b4a513b-053c-c964-538d-ac4870bd82ef (Addr: tcp/127.0.0.1:50554) (DC: dc1)
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.434215 [INFO] consul: Handled member-join event for server "Node 8b4a513b-053c-c964-538d-ac4870bd82ef.dc1" in area "wan"
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.435405 [INFO] agent: Started DNS server 127.0.0.1:50549 (tcp)
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.435814 [INFO] agent: Started DNS server 127.0.0.1:50549 (udp)
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.438070 [INFO] agent: Started HTTP server on 127.0.0.1:50550 (tcp)
TestAgent_ServiceWatchCh - 2019/12/30 18:51:18.438153 [INFO] agent: started state syncer
2019/12/30 18:51:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:18 [INFO]  raft: Node at 127.0.0.1:50554 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:18 [INFO]  raft: Node at 127.0.0.1:50548 [Leader] entering Leader state
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.889319 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:18.889861 [INFO] consul: New leader elected: Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:18.995758 [INFO] agent: Synced node info
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:18.995910 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:18.996943 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:18.997057 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:18.998089 [INFO] serf: EventMemberUpdate: Node 54ec4706-15bc-e317-8e39-1e4233613bb8
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:19.000063 [INFO] serf: EventMemberUpdate: Node 54ec4706-15bc-e317-8e39-1e4233613bb8.dc1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:19.000352 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:19.001259 [INFO] serf: EventMemberUpdate: Node 54ec4706-15bc-e317-8e39-1e4233613bb8
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:19.002086 [INFO] serf: EventMemberUpdate: Node 54ec4706-15bc-e317-8e39-1e4233613bb8.dc1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:19.188633 [WARN] agent: Node info update blocked by ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:19.188783 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:19.313152 [INFO] agent: Synced node info
2019/12/30 18:51:19 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:19 [INFO]  raft: Node at 127.0.0.1:50554 [Leader] entering Leader state
TestAgent_ServiceWatchCh - 2019/12/30 18:51:19.439949 [INFO] consul: cluster leadership acquired
TestAgent_ServiceWatchCh - 2019/12/30 18:51:19.440402 [INFO] consul: New leader elected: Node 8b4a513b-053c-c964-538d-ac4870bd82ef
TestAgent_ServiceWatchCh - 2019/12/30 18:51:20.155240 [INFO] agent: Synced node info
TestAgent_ServiceWatchCh - 2019/12/30 18:51:20.155367 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:20.441691 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:20.470378 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:20.470862 [DEBUG] consul: Skipping self join check for "Node 54ec4706-15bc-e317-8e39-1e4233613bb8" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:20.470962 [INFO] consul: member 'Node 54ec4706-15bc-e317-8e39-1e4233613bb8' joined, marking health alive
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:20.679584 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:20.680152 [DEBUG] consul: Skipping self join check for "Node c7e2edea-8955-0f94-daae-282f924bd3e0" since the cluster is too small
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:20.680331 [INFO] consul: member 'Node c7e2edea-8955-0f94-daae-282f924bd3e0' joined, marking health alive
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:20.795268 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:20.795782 [DEBUG] consul: Skipping self join check for "Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb" since the cluster is too small
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:20.795992 [INFO] consul: member 'Node 74a03506-dfa8-aa29-508a-ce5fc89cd6cb' joined, marking health alive
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:20.800624 [DEBUG] consul: Skipping self join check for "Node 54ec4706-15bc-e317-8e39-1e4233613bb8" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:20.801261 [DEBUG] consul: Skipping self join check for "Node 54ec4706-15bc-e317-8e39-1e4233613bb8" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:20.817012 [DEBUG] consul: dropping node "Node 54ec4706-15bc-e317-8e39-1e4233613bb8" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.204853 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.206129 [WARN] agent: Service "mysql" registration blocked by ACLs
TestAgent_ServiceWatchCh - 2019/12/30 18:51:21.216796 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:21.225830 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:21.346542 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:21.728365 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:21.731140 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:21.731230 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:21.731273 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:21.732080 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:21.738699 [INFO] agent: Synced check "mysql"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:21.738791 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:21.740834 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.954351 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.956760 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.956818 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.956945 [DEBUG] agent: Service "mysql" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.956989 [DEBUG] agent: Service "api" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.957024 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:21.957459 [WARN] agent: Service "mysql" registration blocked by ACLs
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.045882 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.048728 [DEBUG] consul: Skipping self join check for "Node 8b4a513b-053c-c964-538d-ac4870bd82ef" since the cluster is too small
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.049724 [INFO] consul: member 'Node 8b4a513b-053c-c964-538d-ac4870bd82ef' joined, marking health alive
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.076928 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.255068 [INFO] agent: Synced check "redis"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.255151 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.260156 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.260241 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.263270 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.263676 [WARN] agent: Service "mysql" registration blocked by ACLs
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.372896 [INFO] agent: Synced service "redis"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.372993 [DEBUG] agent: Check "redis:2" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.373041 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.373082 [DEBUG] agent: Check "redis:1" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.373125 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.373439 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.373475 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.374365 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.374495 [WARN] serf: Shutdown without a Leave
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.374754 [INFO] agent: Requesting shutdown
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.374823 [INFO] consul: shutting down server
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.374877 [WARN] serf: Shutdown without a Leave
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.478067 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.478122 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.594895 [INFO] manager: shutting down
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.595956 [INFO] agent: Deregistered service "api"
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.596514 [INFO] agent: Deregistered service "svc_id1"
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.596609 [DEBUG] agent: Node info in sync
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.596901 [DEBUG] agent: Node info in sync
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.597352 [INFO] manager: shutting down
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.598058 [INFO] agent: consul server down
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.598129 [INFO] agent: shutdown complete
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.598210 [INFO] agent: Stopping DNS server 127.0.0.1:50549 (tcp)
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.598388 [INFO] agent: Stopping DNS server 127.0.0.1:50549 (udp)
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.598570 [INFO] agent: Stopping HTTP server 127.0.0.1:50550 (tcp)
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.598820 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ServiceWatchCh - 2019/12/30 18:51:22.598925 [INFO] agent: Endpoints down
--- PASS: TestAgent_ServiceWatchCh (5.86s)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.600143 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.600156 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.600548 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.600743 [INFO] agent: Stopping DNS server 127.0.0.1:50543 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.600387 [WARN] agent: Syncing service "redis" failed. No cluster leader
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.601211 [ERR] agent: failed to sync remote state: No cluster leader
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.601311 [INFO] agent: Stopping DNS server 127.0.0.1:50543 (udp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.601695 [INFO] agent: Stopping HTTP server 127.0.0.1:50544 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.602256 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_WithChecks - 2019/12/30 18:51:22.602479 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_WithChecks (7.11s)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.730900 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.730988 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731032 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731063 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731340 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731402 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731489 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731578 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731618 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.731951 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.912263 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.912624 [DEBUG] agent: Service "mysql" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.912689 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.912961 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.913043 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:22.913093 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:22.916240 [INFO] agent: Synced check "cache"
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.019802 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.169941 [INFO] manager: shutting down
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.170690 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.170758 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.170825 [INFO] agent: Stopping DNS server 127.0.0.1:50531 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.171017 [INFO] agent: Stopping DNS server 127.0.0.1:50531 (udp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.171210 [INFO] agent: Stopping HTTP server 127.0.0.1:50532 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.171461 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/30 18:51:23.171519 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_ACLDeny (9.82s)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.173065 [INFO] agent: Deregistered check "lb"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.173141 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.173200 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.173245 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.173290 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.174342 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.429878 [INFO] agent: Deregistered check "redis"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.429965 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.430009 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.430040 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.430289 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.430382 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.712934 [INFO] agent: Deregistered check "redis"
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.713066 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:23.713115 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.029629 [INFO] agent: Synced node info
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.030230 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.030407 [INFO] consul: shutting down server
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.030535 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.170364 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.294884 [INFO] manager: shutting down
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.297154 [INFO] agent: consul server down
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.297323 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.297480 [INFO] agent: Stopping DNS server 127.0.0.1:50537 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.297933 [INFO] agent: Stopping DNS server 127.0.0.1:50537 (udp)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.298300 [INFO] agent: Stopping HTTP server 127.0.0.1:50538 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.298779 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Checks - 2019/12/30 18:51:24.298959 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Checks (9.38s)
PASS
ok  	github.com/hashicorp/consul/agent/local	18.791s
=== RUN   TestBuild
=== RUN   TestBuild/no_version
=== RUN   TestBuild/bad_version
=== RUN   TestBuild/good_version
=== RUN   TestBuild/rc_version
=== RUN   TestBuild/ent_version
--- PASS: TestBuild (0.01s)
    --- PASS: TestBuild/no_version (0.00s)
    --- PASS: TestBuild/bad_version (0.00s)
    --- PASS: TestBuild/good_version (0.00s)
    --- PASS: TestBuild/rc_version (0.00s)
    --- PASS: TestBuild/ent_version (0.00s)
=== RUN   TestServer_Key_Equal
--- PASS: TestServer_Key_Equal (0.00s)
=== RUN   TestServer_Key
--- PASS: TestServer_Key (0.00s)
=== RUN   TestServer_Key_params
--- PASS: TestServer_Key_params (0.00s)
=== RUN   TestIsConsulServer
--- PASS: TestIsConsulServer (0.00s)
=== RUN   TestIsConsulServer_Optional
--- PASS: TestIsConsulServer_Optional (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/metadata	0.125s
?   	github.com/hashicorp/consul/agent/mock	[no test files]
?   	github.com/hashicorp/consul/agent/pool	[no test files]
=== RUN   TestManager_BasicLifecycle
--- SKIP: TestManager_BasicLifecycle (0.00s)
    manager_test.go:42: DM-skipped
=== RUN   TestManager_deliverLatest
--- PASS: TestManager_deliverLatest (0.00s)
=== RUN   TestStateChanged
=== RUN   TestStateChanged/nil_node_service
=== RUN   TestStateChanged/same_service
=== RUN   TestStateChanged/same_service,_different_token
=== RUN   TestStateChanged/different_service_ID
=== RUN   TestStateChanged/different_address
=== RUN   TestStateChanged/different_port
=== RUN   TestStateChanged/different_service_kind
=== RUN   TestStateChanged/different_proxy_target
=== RUN   TestStateChanged/different_proxy_upstreams
--- PASS: TestStateChanged (0.01s)
    --- PASS: TestStateChanged/nil_node_service (0.00s)
    --- PASS: TestStateChanged/same_service (0.00s)
    --- PASS: TestStateChanged/same_service,_different_token (0.00s)
    --- PASS: TestStateChanged/different_service_ID (0.00s)
    --- PASS: TestStateChanged/different_address (0.00s)
    --- PASS: TestStateChanged/different_port (0.00s)
    --- PASS: TestStateChanged/different_service_kind (0.00s)
    --- PASS: TestStateChanged/different_proxy_target (0.00s)
    --- PASS: TestStateChanged/different_proxy_upstreams (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/proxycfg	0.328s
=== RUN   TestDaemon_impl
--- PASS: TestDaemon_impl (0.00s)
=== RUN   TestDaemonStartStop
=== PAUSE TestDaemonStartStop
=== RUN   TestDaemonRestart
=== PAUSE TestDaemonRestart
=== RUN   TestDaemonLaunchesNewProcessGroup
=== PAUSE TestDaemonLaunchesNewProcessGroup
=== RUN   TestDaemonStop_kill
=== PAUSE TestDaemonStop_kill
=== RUN   TestDaemonStop_killAdopted
=== PAUSE TestDaemonStop_killAdopted
=== RUN   TestDaemonStart_pidFile
=== PAUSE TestDaemonStart_pidFile
=== RUN   TestDaemonRestart_pidFile
=== PAUSE TestDaemonRestart_pidFile
=== RUN   TestDaemonEqual
=== RUN   TestDaemonEqual/Different_type
=== RUN   TestDaemonEqual/Nil
=== RUN   TestDaemonEqual/Equal
=== RUN   TestDaemonEqual/Different_proxy_ID
=== RUN   TestDaemonEqual/Different_path
=== RUN   TestDaemonEqual/Different_dir
=== RUN   TestDaemonEqual/Different_args
=== RUN   TestDaemonEqual/Different_token
--- PASS: TestDaemonEqual (0.00s)
    --- PASS: TestDaemonEqual/Different_type (0.00s)
    --- PASS: TestDaemonEqual/Nil (0.00s)
    --- PASS: TestDaemonEqual/Equal (0.00s)
    --- PASS: TestDaemonEqual/Different_proxy_ID (0.00s)
    --- PASS: TestDaemonEqual/Different_path (0.00s)
    --- PASS: TestDaemonEqual/Different_dir (0.00s)
    --- PASS: TestDaemonEqual/Different_args (0.00s)
    --- PASS: TestDaemonEqual/Different_token (0.00s)
=== RUN   TestDaemonMarshalSnapshot
=== RUN   TestDaemonMarshalSnapshot/stopped_daemon
=== RUN   TestDaemonMarshalSnapshot/basic
--- PASS: TestDaemonMarshalSnapshot (0.00s)
    --- PASS: TestDaemonMarshalSnapshot/stopped_daemon (0.00s)
    --- PASS: TestDaemonMarshalSnapshot/basic (0.00s)
=== RUN   TestDaemonUnmarshalSnapshot
=== PAUSE TestDaemonUnmarshalSnapshot
=== RUN   TestDaemonUnmarshalSnapshot_notRunning
=== PAUSE TestDaemonUnmarshalSnapshot_notRunning
=== RUN   TestManagerClose_noRun
=== PAUSE TestManagerClose_noRun
=== RUN   TestManagerRun_initialSync
=== PAUSE TestManagerRun_initialSync
=== RUN   TestManagerRun_syncNew
=== PAUSE TestManagerRun_syncNew
=== RUN   TestManagerRun_syncDelete
=== PAUSE TestManagerRun_syncDelete
=== RUN   TestManagerRun_syncUpdate
=== PAUSE TestManagerRun_syncUpdate
=== RUN   TestManagerRun_daemonLogs
=== PAUSE TestManagerRun_daemonLogs
=== RUN   TestManagerRun_daemonPid
--- SKIP: TestManagerRun_daemonPid (0.00s)
    manager_test.go:262: DM-skipped
=== RUN   TestManagerPassesEnvironment
=== PAUSE TestManagerPassesEnvironment
=== RUN   TestManagerPassesProxyEnv
--- SKIP: TestManagerPassesProxyEnv (0.01s)
    manager_test.go:354: DM-skipped
=== RUN   TestManagerRun_snapshotRestore
=== PAUSE TestManagerRun_snapshotRestore
=== RUN   TestManagerRun_rootDisallow
2019/12/30 18:50:41 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:41 [WARN] agent/proxy: running as root, will not start managed proxies
2019/12/30 18:50:41 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
--- PASS: TestManagerRun_rootDisallow (0.10s)
=== RUN   TestNoop_impl
--- PASS: TestNoop_impl (0.00s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== CONT  TestDaemonStartStop
=== CONT  TestManagerClose_noRun
=== CONT  TestDaemonUnmarshalSnapshot_notRunning
--- PASS: TestManagerClose_noRun (0.00s)
=== CONT  TestManagerRun_snapshotRestore
=== CONT  TestDaemonUnmarshalSnapshot
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy283238703/file"}
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy448999225/file"}
2019/12/30 18:50:41 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy727263187/file"}
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy561500438/file"}
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonStartStop (0.12s)
=== CONT  TestDaemonStop_kill
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "stop-kill", "/tmp/test-agent-proxy402216317/file"}
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonUnmarshalSnapshot_notRunning (0.16s)
=== CONT  TestDaemonRestart_pidFile
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy626858168/file"}
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/30 18:50:41 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/30 18:50:41 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy727263187/file2"}
2019/12/30 18:50:41 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy626858168/file"}
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: graceful wait of 200ms passed, killing
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon left running
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonRestart_pidFile (0.28s)
=== CONT  TestDaemonStart_pidFile
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-once", "/tmp/test-agent-proxy458018986/file"}
Unknown command: "start-once"
--- PASS: TestDaemonStop_kill (0.38s)
=== CONT  TestDaemonStop_killAdopted
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 2
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-once", "/tmp/test-agent-proxy458018986/file"}
Unknown command: "start-once"
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 2
--- PASS: TestDaemonStart_pidFile (0.14s)
=== CONT  TestDaemonRestart
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy244265836/file"}
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy244265836/file"}
logger: 2019/12/30 18:50:41 [DEBUG] agent/proxy: graceful wait of 200ms passed, killing
logger: 2019/12/30 18:50:41 [INFO] agent/proxy: daemon exited with exit code: 0
=== CONT  TestDaemonLaunchesNewProcessGroup
--- PASS: TestDaemonRestart (0.17s)
logger: 2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy273564123/file"}
2019/12/30 18:50:42 Started child
--- PASS: TestDaemonStop_killAdopted (0.36s)
=== CONT  TestManagerRun_syncDelete
2019/12/30 18:50:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy168953541/file"}
2019/12/30 18:50:42 Started child
logger: 2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy273564123/file"}
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/30 18:50:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
--- PASS: TestManagerRun_syncDelete (0.18s)
=== CONT  TestManagerPassesEnvironment
2019/12/30 18:50:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "environ", "/tmp/test-agent-proxy320043839/env-variables"}
--- PASS: TestDaemonLaunchesNewProcessGroup (0.33s)
    daemon_test.go:224: Child PID was 23722 and still 23722
    daemon_test.go:241: Child PID was 23722 and is now 23744
=== CONT  TestManagerRun_daemonLogs
2019/12/30 18:50:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "output", "/tmp/test-agent-proxy011423634/notify"}
2019/12/30 18:50:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
logger: 2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with error: process 23611 is dead or running as another user
--- PASS: TestDaemonUnmarshalSnapshot (1.15s)
=== CONT  TestManagerRun_syncUpdate
2019/12/30 18:50:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy840616596/file"}
=== CONT  TestManagerRun_initialSync
2019/12/30 18:50:42 [DEBUG] agent/proxy: managed Connect proxy manager started
--- PASS: TestManagerPassesEnvironment (0.12s)
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy360915686/file"}
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with error: process 23607 is dead or running as another user
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestManagerRun_snapshotRestore (1.21s)
=== CONT  TestManagerRun_syncNew
2019/12/30 18:50:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/30 18:50:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy887913736/file"}
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy360915686/file"}
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy840616596/file2"}
2019/12/30 18:50:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy887913736/file2"}
2019/12/30 18:50:42 [INFO] agent/proxy: daemon left running
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 1
--- PASS: TestManagerRun_initialSync (0.27s)
--- PASS: TestManagerRun_daemonLogs (0.34s)
logger: 2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with error: process 23680 is dead or running as another user
2019/12/30 18:50:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/30 18:50:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/30 18:50:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build934878654/b748/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy887913736/file2"}
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestManagerRun_syncUpdate (0.38s)
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/30 18:50:42 [INFO] agent/proxy: daemon exited with exit code: 1
--- PASS: TestManagerRun_syncNew (0.39s)
PASS
ok  	github.com/hashicorp/consul/agent/proxyprocess	1.929s
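The proxyprocess tests above repeatedly launch the test binary itself with arguments like `-test.run=TestHelperProcess -- WANT_HELPER_PROCESS start-stop …`: the binary re-executes itself, and the re-executed copy plays the managed proxy. A hedged, standalone sketch of that self-re-exec "helper process" pattern (the `WANT_HELPER_PROCESS` marker is taken from the log; the logic here is illustrative, not Consul's test code):

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

// helperMain is what the re-executed "child" copy does: it stands in
// for the managed proxy process.
func helperMain(mode string) string {
	return "helper started in mode: " + mode
}

func main() {
	if len(os.Args) > 2 && os.Args[1] == "WANT_HELPER_PROCESS" {
		// Child branch: we were re-executed as the fake proxy.
		fmt.Println(helperMain(os.Args[2]))
		os.Exit(0)
	}
	// Parent branch: re-exec ourselves with the marker argument,
	// mirroring the command lines recorded in the log above.
	out, err := exec.Command(os.Args[0], "WANT_HELPER_PROCESS", "start-stop").CombinedOutput()
	if err != nil {
		fmt.Println("helper failed:", err)
		return
	}
	fmt.Print(string(out))
}
```

In the real tests the parent is `go test`, so the child is selected with `-test.run=TestHelperProcess`; the `Unknown command: "start-once"` / exit code 2 lines show the helper deliberately rejecting a mode to exercise the error path.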
=== RUN   TestManagerInternal_cycleServer
--- PASS: TestManagerInternal_cycleServer (0.00s)
=== RUN   TestManagerInternal_getServerList
--- PASS: TestManagerInternal_getServerList (0.00s)
=== RUN   TestManagerInternal_New
--- PASS: TestManagerInternal_New (0.00s)
=== RUN   TestManagerInternal_reconcileServerList
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [WARN] manager: No servers available
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s00 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s00 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 2 servers, next active server is s00 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s03 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s02 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s00 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s03 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 4 servers, next active server is s01 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 7 servers, next active server is s05 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 58 servers, next active server is s41 (Addr: /) (DC: )
--- PASS: TestManagerInternal_reconcileServerList (0.01s)
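The `failed: %!s(<nil>)` fragments above are Go's `fmt` nil-operand notation: the manager formats a ping error with `%s`, but on these code paths the error value is a nil interface. A minimal reproduction (assumed message text for illustration):

```go
package main

import "fmt"

// formatNilErr shows how the "%!s(<nil>)" fragments in the log arise:
// formatting a nil error interface with the %s verb makes fmt emit its
// nil-operand notation instead of a message.
func formatNilErr() string {
	var err error // nil, as when the ping "fails" without a concrete error
	return fmt.Sprintf("pinging server failed: %s", err)
}

func main() {
	fmt.Println(formatNilErr())
}
```

So the lines are cosmetic logging noise, not test failures; guarding the format call with an `err != nil` check would avoid them.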
=== RUN   TestManagerInternal_refreshServerRebalanceTimer
--- PASS: TestManagerInternal_refreshServerRebalanceTimer (0.00s)
=== RUN   TestManagerInternal_saveServerList
--- PASS: TestManagerInternal_saveServerList (0.00s)
=== RUN   TestRouter_Shutdown
--- PASS: TestRouter_Shutdown (0.00s)
=== RUN   TestRouter_Routing
--- PASS: TestRouter_Routing (0.00s)
=== RUN   TestRouter_Routing_Offline
2019/12/30 18:51:00 [DEBUG] manager: pinging server "node1.dc1 (Addr: tcp/127.0.0.2:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "node3.dc1 (Addr: tcp/127.0.0.4:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "node2.dc1 (Addr: tcp/127.0.0.3:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "node4.dc1 (Addr: tcp/127.0.0.5:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
--- PASS: TestRouter_Routing_Offline (0.00s)
=== RUN   TestRouter_GetDatacenters
--- PASS: TestRouter_GetDatacenters (0.00s)
=== RUN   TestRouter_distanceSorter
--- PASS: TestRouter_distanceSorter (0.00s)
=== RUN   TestRouter_GetDatacentersByDistance
--- PASS: TestRouter_GetDatacentersByDistance (0.00s)
=== RUN   TestRouter_GetDatacenterMaps
--- PASS: TestRouter_GetDatacenterMaps (0.00s)
=== RUN   TestServers_AddServer
--- PASS: TestServers_AddServer (0.00s)
=== RUN   TestServers_IsOffline
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [INFO] manager: shutting down
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
--- PASS: TestServers_IsOffline (0.02s)
=== RUN   TestServers_FindServer
2019/12/30 18:51:00 [WARN] manager: No servers available
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s1"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s2"
--- PASS: TestServers_FindServer (0.00s)
=== RUN   TestServers_New
--- PASS: TestServers_New (0.00s)
=== RUN   TestServers_NotifyFailedServer
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s1"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s2"
--- PASS: TestServers_NotifyFailedServer (0.00s)
=== RUN   TestServers_NumServers
--- PASS: TestServers_NumServers (0.00s)
=== RUN   TestServers_RebalanceServers
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s47 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s34 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s67 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s82 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s33 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s95 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s32 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s59 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s64 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s36 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s82 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s94 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s56 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s59 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s45 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: pinging server "s95 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s20 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s07 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s32"
[... 58 similar "manager: cycled away from server" DEBUG lines elided; between rebalances the manager cycles once through each of the 100 test servers s00–s99 ...]
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:00 [DEBUG] manager: Rebalanced 100 servers, next active server is s36 (Addr: /) (DC: )
2019/12/30 18:51:00 [DEBUG] manager: cycled away from server "s36"
[... 98 similar "cycled away from server" lines elided ...]
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s04 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s29 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
[... 98 similar "cycled away from server" lines elided ...]
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s72 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s97 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s01 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
[... 98 similar "cycled away from server" lines elided ...]
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s87 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s50 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
[... 98 similar "cycled away from server" lines elided ...]
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s70 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s11 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
[... 62 similar "cycled away from server" lines elided ...]
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s77 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
[... 99 further identical "manager: cycled away from server" DEBUG lines at 18:51:01 elided; the manager cycled through all 100 test servers in shuffled order ...]
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s25 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
[... 99 further identical "manager: cycled away from server" DEBUG lines at 18:51:01 elided; the manager cycled through all 100 test servers in shuffled order ...]
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s40 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s22 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
[... 99 further identical "manager: cycled away from server" DEBUG lines at 18:51:01 elided; the manager cycled through all 100 test servers in shuffled order ...]
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s65 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s58 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
[... 99 further identical "manager: cycled away from server" DEBUG lines at 18:51:01 elided; the manager cycled through all 100 test servers in shuffled order ...]
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s04 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s19 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s64 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s83 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s68 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s80 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s85 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s46 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s83 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s36 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s46 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s01 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s07 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s67 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s58 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s24 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s61 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s22 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s65 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s98 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s41 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s51 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s52 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s97 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s76 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s62 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s25 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s41 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s26 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s71 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s87 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s08 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s02 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s59 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s16 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s69 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s74 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s22 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s63 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s55 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s44 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s23 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s45 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s68 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s45 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s76 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s76 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s63 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s51 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s37 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s64 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s78 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s31 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s80 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s26 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s00 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s90 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s38 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s06 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s78 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s96 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s60 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s05 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s54 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s90 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s75 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s32 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s68 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s85 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s12 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s78 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s95 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s71 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s15 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s44 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s52 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s03 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s97 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: pinging server "s94 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s79 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s88 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s73 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s50 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s69"
[... 49 more identical 'manager: cycled away from server "sNN"' DEBUG lines elided ...]
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s06 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s06"
[... 99 more identical 'manager: cycled away from server "sNN"' DEBUG lines elided ...]
2019/12/30 18:51:01 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/12/30 18:51:01 [DEBUG] manager: cycled away from server "s17"
[... 99 more identical 'manager: cycled away from server "sNN"' DEBUG lines elided ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s25 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s36 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
[... 99 more identical 'manager: cycled away from server "sNN"' DEBUG lines elided ...]
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s71 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
[... 99 more identical 'manager: cycled away from server "sNN"' DEBUG lines elided ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s56 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s12 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s96 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
[... 75 more identical 'manager: cycled away from server "sNN"' DEBUG lines elided ...]
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s40 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s44 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s92 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s75 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s37 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s47 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s90 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
[... 42 more "cycled away from server" lines at 18:51:02 elided ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s29 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s02 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
[... 99 more "cycled away from server" lines at 18:51:02 elided (one full pass over all 100 servers) ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s02 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s73 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
[... 99 more "cycled away from server" lines at 18:51:02 elided (one full pass over all 100 servers) ...]
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s37 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
[... 99 more "cycled away from server" lines at 18:51:02 elided (one full pass over all 100 servers) ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s49 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s53 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s55 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
[... 99 more "cycled away from server" lines at 18:51:02 elided (one full pass over all 100 servers) ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s66 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
[... 79 more "cycled away from server" lines at 18:51:02 elided ...]
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s74 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s34 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s76 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s23 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s53 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s53 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s76 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s07 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s23 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s47 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s75 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s73 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s13 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s45 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s43 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s24 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s28 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s99 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s30 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s23 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
[... 54 similar "cycled away from server" debug lines (s96 through s30) elided ...]
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s77 (Addr: /) (DC: )
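The runs of "cycled away from server" lines above come from consul's client-side server manager test: the test builds a list of 100 fake servers (s00–s99), and each log line records one rotation of the list, after which a "Rebalanced 100 servers" line names the new head of the (reshuffled) list. The empty `(Addr: /) (DC: )` fields appear because the test servers are constructed without addresses or datacenters. A minimal sketch of that rotation, with illustrative names (`serverRing`, `Cycle`) that are not consul's actual API:

```go
package main

import "fmt"

// serverRing is a toy model of the manager's server list: index 0 is
// the active server, and Cycle rotates it to the back of the list.
type serverRing struct {
	servers []string
}

// Cycle moves the active server to the end and returns it; one
// "cycled away from server" log line corresponds to one such rotation.
func (r *serverRing) Cycle() string {
	old := r.servers[0]
	r.servers = append(r.servers[1:], old)
	return old
}

func main() {
	r := &serverRing{servers: []string{"s00", "s01", "s02"}}
	for i := 0; i < len(r.servers); i++ {
		fmt.Printf("cycled away from server %q\n", r.Cycle())
	}
	fmt.Println("next active server is", r.servers[0])
}
```

After cycling through every entry the ring is back in its starting order, which is why the test follows a full pass with a rebalance (a reshuffle) before picking the next active server.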
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
[... 99 similar "cycled away from server" debug lines (s12 through s83) elided ...]
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s78 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
[... 99 similar "cycled away from server" debug lines (s92 through s00) elided ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s37 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s00 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s46 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
[... 99 similar "cycled away from server" debug lines (s67 through s46) elided ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s93 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s36 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s80 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s59 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s82 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
[... 99 similar "cycled away from server" debug lines (s70 through s59) elided ...]
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s99 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s43 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
[... 63 similar "cycled away from server" debug lines (s05 through s04) elided; the run continues ...]
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s54 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s08 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s48 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s95 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s97 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s87 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s83 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s83"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s26"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s95"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s00"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s28"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s65"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s92"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s34"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s43"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s97"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s98"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s36"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s60"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s87"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s58"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s79"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s55"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s90"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s64"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s24"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s99"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s75"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s59"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s62"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s96"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s72"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s31"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s66"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s88"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s70"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s37"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s46"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s67"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s69"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s50"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s78"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s27"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s29"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s91"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s20"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s71"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s45"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s77"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s56"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s35"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s39"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s44"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s85"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s49"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s61"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s81"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s52"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s53"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s73"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s89"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s41"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s42"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s40"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s32"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s80"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s82"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s30"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s57"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s51"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s93"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s38"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s21"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s94"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s84"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s47"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s54"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s22"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s23"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s01"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s33"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s63"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s86"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s02"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s16"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s48"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s25"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s74"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s76"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s68"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
--- PASS: TestServers_RebalanceServers (1.63s)
=== RUN   TestServers_RebalanceServers_AvoidFailed
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
--- PASS: TestServers_RebalanceServers_AvoidFailed (0.01s)
=== RUN   TestManager_RemoveServer
2019/12/30 18:51:02 [DEBUG] manager: Rebalanced 19 servers, next active server is s16 (Addr: /) (DC: )
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s17"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s11"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s05"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s12"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s04"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s03"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s14"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s07"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s19"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s15"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s10"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s08"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s18"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s06"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s13"
2019/12/30 18:51:02 [DEBUG] manager: cycled away from server "s09"
--- PASS: TestManager_RemoveServer (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/router	1.774s
=== RUN   TestStructs_ACLCaches
=== PAUSE TestStructs_ACLCaches
=== RUN   TestStructs_ACL_IsSame
--- PASS: TestStructs_ACL_IsSame (0.00s)
=== RUN   TestStructs_ACL_Convert
=== PAUSE TestStructs_ACL_Convert
=== RUN   TestStructs_ACLToken_Convert
=== PAUSE TestStructs_ACLToken_Convert
=== RUN   TestStructs_ACLToken_PolicyIDs
=== PAUSE TestStructs_ACLToken_PolicyIDs
=== RUN   TestStructs_ACLToken_EmbeddedPolicy
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy
=== RUN   TestStructs_ACLServiceIdentity_SyntheticPolicy
=== PAUSE TestStructs_ACLServiceIdentity_SyntheticPolicy
=== RUN   TestStructs_ACLToken_SetHash
=== PAUSE TestStructs_ACLToken_SetHash
=== RUN   TestStructs_ACLToken_EstimateSize
=== PAUSE TestStructs_ACLToken_EstimateSize
=== RUN   TestStructs_ACLToken_Stub
=== PAUSE TestStructs_ACLToken_Stub
=== RUN   TestStructs_ACLTokens_Sort
=== PAUSE TestStructs_ACLTokens_Sort
=== RUN   TestStructs_ACLTokenListStubs_Sort
=== PAUSE TestStructs_ACLTokenListStubs_Sort
=== RUN   TestStructs_ACLPolicy_Stub
=== PAUSE TestStructs_ACLPolicy_Stub
=== RUN   TestStructs_ACLPolicy_SetHash
=== PAUSE TestStructs_ACLPolicy_SetHash
=== RUN   TestStructs_ACLPolicy_EstimateSize
=== PAUSE TestStructs_ACLPolicy_EstimateSize
=== RUN   TestStructs_ACLPolicies_Sort
=== PAUSE TestStructs_ACLPolicies_Sort
=== RUN   TestStructs_ACLPolicyListStubs_Sort
=== PAUSE TestStructs_ACLPolicyListStubs_Sort
=== RUN   TestStructs_ACLPolicies_resolveWithCache
=== PAUSE TestStructs_ACLPolicies_resolveWithCache
=== RUN   TestStructs_ACLPolicies_Compile
=== PAUSE TestStructs_ACLPolicies_Compile
=== RUN   TestCheckDefinition_Defaults
=== PAUSE TestCheckDefinition_Defaults
=== RUN   TestCheckDefinition_CheckType
=== PAUSE TestCheckDefinition_CheckType
=== RUN   TestCheckDefinitionToCheckType
=== PAUSE TestCheckDefinitionToCheckType
=== RUN   TestDecodeConfigEntry
=== PAUSE TestDecodeConfigEntry
=== RUN   TestServiceConfigResponse_MsgPack
--- PASS: TestServiceConfigResponse_MsgPack (0.00s)
=== RUN   TestConfigEntryResponseMarshalling
=== PAUSE TestConfigEntryResponseMarshalling
=== RUN   TestCAConfiguration_GetCommonConfig
=== RUN   TestCAConfiguration_GetCommonConfig/basic_defaults
=== RUN   TestCAConfiguration_GetCommonConfig/basic_defaults_after_encoding_fun
--- PASS: TestCAConfiguration_GetCommonConfig (0.00s)
    --- PASS: TestCAConfiguration_GetCommonConfig/basic_defaults (0.00s)
    --- PASS: TestCAConfiguration_GetCommonConfig/basic_defaults_after_encoding_fun (0.00s)
=== RUN   TestConnectProxyConfig_ToAPI
=== RUN   TestConnectProxyConfig_ToAPI/service
--- PASS: TestConnectProxyConfig_ToAPI (0.00s)
    --- PASS: TestConnectProxyConfig_ToAPI/service (0.00s)
=== RUN   TestUpstream_MarshalJSON
=== RUN   TestUpstream_MarshalJSON/service
=== RUN   TestUpstream_MarshalJSON/pq
--- PASS: TestUpstream_MarshalJSON (0.00s)
    --- PASS: TestUpstream_MarshalJSON/service (0.00s)
    --- PASS: TestUpstream_MarshalJSON/pq (0.00s)
=== RUN   TestUpstream_UnmarshalJSON
=== RUN   TestUpstream_UnmarshalJSON/service
=== RUN   TestUpstream_UnmarshalJSON/pq
--- PASS: TestUpstream_UnmarshalJSON (0.00s)
    --- PASS: TestUpstream_UnmarshalJSON/service (0.00s)
    --- PASS: TestUpstream_UnmarshalJSON/pq (0.00s)
=== RUN   TestConnectManagedProxy_ParseConfig
=== RUN   TestConnectManagedProxy_ParseConfig/empty
=== RUN   TestConnectManagedProxy_ParseConfig/specified
=== RUN   TestConnectManagedProxy_ParseConfig/stringy_port
=== RUN   TestConnectManagedProxy_ParseConfig/empty_addr
=== RUN   TestConnectManagedProxy_ParseConfig/empty_port
=== RUN   TestConnectManagedProxy_ParseConfig/junk_address
=== RUN   TestConnectManagedProxy_ParseConfig/zero_port,_missing_addr
=== RUN   TestConnectManagedProxy_ParseConfig/extra_fields_present
--- PASS: TestConnectManagedProxy_ParseConfig (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/specified (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/stringy_port (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty_addr (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty_port (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/junk_address (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/zero_port,_missing_addr (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/extra_fields_present (0.00s)
=== RUN   TestIntentionGetACLPrefix
=== RUN   TestIntentionGetACLPrefix/unset_name
=== RUN   TestIntentionGetACLPrefix/set_name
--- PASS: TestIntentionGetACLPrefix (0.00s)
    --- PASS: TestIntentionGetACLPrefix/unset_name (0.00s)
    --- PASS: TestIntentionGetACLPrefix/set_name (0.00s)
=== RUN   TestIntentionValidate
=== RUN   TestIntentionValidate/long_description
=== RUN   TestIntentionValidate/no_action_set
=== RUN   TestIntentionValidate/no_SourceNS
=== RUN   TestIntentionValidate/no_SourceName
=== RUN   TestIntentionValidate/no_DestinationNS
=== RUN   TestIntentionValidate/no_DestinationName
=== RUN   TestIntentionValidate/SourceNS_partial_wildcard
=== RUN   TestIntentionValidate/SourceName_partial_wildcard
=== RUN   TestIntentionValidate/SourceName_exact_following_wildcard
=== RUN   TestIntentionValidate/DestinationNS_partial_wildcard
=== RUN   TestIntentionValidate/DestinationName_partial_wildcard
=== RUN   TestIntentionValidate/DestinationName_exact_following_wildcard
=== RUN   TestIntentionValidate/SourceType_is_not_set
=== RUN   TestIntentionValidate/SourceType_is_other
--- PASS: TestIntentionValidate (0.01s)
    --- PASS: TestIntentionValidate/long_description (0.00s)
    --- PASS: TestIntentionValidate/no_action_set (0.00s)
    --- PASS: TestIntentionValidate/no_SourceNS (0.00s)
    --- PASS: TestIntentionValidate/no_SourceName (0.00s)
    --- PASS: TestIntentionValidate/no_DestinationNS (0.00s)
    --- PASS: TestIntentionValidate/no_DestinationName (0.00s)
    --- PASS: TestIntentionValidate/SourceNS_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceName_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceName_exact_following_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationNS_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationName_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationName_exact_following_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceType_is_not_set (0.00s)
    --- PASS: TestIntentionValidate/SourceType_is_other (0.00s)
=== RUN   TestIntentionPrecedenceSorter
=== RUN   TestIntentionPrecedenceSorter/exhaustive_list
=== RUN   TestIntentionPrecedenceSorter/tiebreak_deterministically
--- PASS: TestIntentionPrecedenceSorter (0.00s)
    --- PASS: TestIntentionPrecedenceSorter/exhaustive_list (0.00s)
    --- PASS: TestIntentionPrecedenceSorter/tiebreak_deterministically (0.00s)
=== RUN   TestStructs_PreparedQuery_GetACLPrefix
--- PASS: TestStructs_PreparedQuery_GetACLPrefix (0.00s)
=== RUN   TestAgentStructs_CheckTypes
=== PAUSE TestAgentStructs_CheckTypes
=== RUN   TestServiceDefinitionValidate
=== RUN   TestServiceDefinitionValidate/valid
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_a_port_set
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_no_port_set
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_native_set
--- PASS: TestServiceDefinitionValidate (0.00s)
    --- PASS: TestServiceDefinitionValidate/valid (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_a_port_set (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_no_port_set (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_native_set (0.00s)
=== RUN   TestServiceDefinitionConnectProxy_json
=== RUN   TestServiceDefinitionConnectProxy_json/no_config
=== RUN   TestServiceDefinitionConnectProxy_json/basic_config
=== RUN   TestServiceDefinitionConnectProxy_json/config_with_upstreams
--- PASS: TestServiceDefinitionConnectProxy_json (0.00s)
    --- PASS: TestServiceDefinitionConnectProxy_json/no_config (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
    --- PASS: TestServiceDefinitionConnectProxy_json/basic_config (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
    --- PASS: TestServiceDefinitionConnectProxy_json/config_with_upstreams (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
=== RUN   TestStructs_FilterFieldConfigurations
=== PAUSE TestStructs_FilterFieldConfigurations
=== RUN   TestEncodeDecode
--- PASS: TestEncodeDecode (0.00s)
=== RUN   TestStructs_Implements
--- PASS: TestStructs_Implements (0.00s)
=== RUN   TestStructs_RegisterRequest_ChangesNode
--- PASS: TestStructs_RegisterRequest_ChangesNode (0.00s)
=== RUN   TestNode_IsSame
--- PASS: TestNode_IsSame (0.00s)
=== RUN   TestStructs_ServiceNode_IsSameService
--- PASS: TestStructs_ServiceNode_IsSameService (0.00s)
=== RUN   TestStructs_ServiceNode_PartialClone
--- PASS: TestStructs_ServiceNode_PartialClone (0.00s)
=== RUN   TestStructs_ServiceNode_Conversions
--- PASS: TestStructs_ServiceNode_Conversions (0.00s)
=== RUN   TestStructs_NodeService_ValidateConnectProxy
=== RUN   TestStructs_NodeService_ValidateConnectProxy/valid
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_whitespace_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_valid_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_port_set
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_ConnectNative_set
--- PASS: TestStructs_NodeService_ValidateConnectProxy (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/valid (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_whitespace_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_valid_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_port_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_ConnectNative_set (0.00s)
=== RUN   TestStructs_NodeService_ValidateSidecarService
=== RUN   TestStructs_NodeService_ValidateSidecarService/valid
=== RUN   TestStructs_NodeService_ValidateSidecarService/ID_can't_be_set
=== RUN   TestStructs_NodeService_ValidateSidecarService/Nested_sidecar_can't_be_set
=== RUN   TestStructs_NodeService_ValidateSidecarService/Sidecar_can't_have_managed_proxy
--- PASS: TestStructs_NodeService_ValidateSidecarService (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/valid (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/ID_can't_be_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/Nested_sidecar_can't_be_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/Sidecar_can't_have_managed_proxy (0.00s)
=== RUN   TestStructs_NodeService_IsSame
--- PASS: TestStructs_NodeService_IsSame (0.00s)
=== RUN   TestStructs_HealthCheck_IsSame
--- PASS: TestStructs_HealthCheck_IsSame (0.00s)
=== RUN   TestStructs_HealthCheck_Marshalling
--- PASS: TestStructs_HealthCheck_Marshalling (0.00s)
=== RUN   TestStructs_HealthCheck_Clone
--- PASS: TestStructs_HealthCheck_Clone (0.00s)
=== RUN   TestStructs_CheckServiceNodes_Shuffle
--- PASS: TestStructs_CheckServiceNodes_Shuffle (0.01s)
=== RUN   TestStructs_CheckServiceNodes_Filter
--- PASS: TestStructs_CheckServiceNodes_Filter (0.00s)
=== RUN   TestStructs_DirEntry_Clone
--- PASS: TestStructs_DirEntry_Clone (0.00s)
=== RUN   TestStructs_ValidateMetadata
--- PASS: TestStructs_ValidateMetadata (0.00s)
=== RUN   TestStructs_validateMetaPair
--- PASS: TestStructs_validateMetaPair (0.00s)
=== RUN   TestSpecificServiceRequest_CacheInfo
=== RUN   TestSpecificServiceRequest_CacheInfo/basic_params
=== RUN   TestSpecificServiceRequest_CacheInfo/name_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/node_meta_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/address_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/tag_filter_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/connect_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/tags_should_be_different
=== RUN   TestSpecificServiceRequest_CacheInfo/tags_should_not_depend_on_order
=== RUN   TestSpecificServiceRequest_CacheInfo/legacy_requests_with_singular_tag_should_be_different
--- PASS: TestSpecificServiceRequest_CacheInfo (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/basic_params (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/name_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/node_meta_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/address_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tag_filter_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/connect_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tags_should_be_different (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tags_should_not_depend_on_order (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/legacy_requests_with_singular_tag_should_be_different (0.00s)
=== CONT  TestStructs_ACLCaches
=== CONT  TestStructs_ACLPolicy_EstimateSize
=== RUN   TestStructs_ACLCaches/New
--- PASS: TestStructs_ACLPolicy_EstimateSize (0.00s)
=== CONT  TestStructs_ACLPolicy_SetHash
=== CONT  TestStructs_ACLTokens_Sort
=== PAUSE TestStructs_ACLCaches/New
--- PASS: TestStructs_ACLTokens_Sort (0.00s)
=== CONT  TestStructs_ACLToken_SetHash
=== RUN   TestStructs_ACLCaches/Identities
=== RUN   TestStructs_ACLToken_SetHash/Nil_Hash_-_Generate
=== PAUSE TestStructs_ACLCaches/Identities
=== RUN   TestStructs_ACLCaches/Policies
=== PAUSE TestStructs_ACLCaches/Policies
=== RUN   TestStructs_ACLCaches/ParsedPolicies
=== PAUSE TestStructs_ACLCaches/ParsedPolicies
=== RUN   TestStructs_ACLCaches/Authorizers
=== PAUSE TestStructs_ACLCaches/Authorizers
=== RUN   TestStructs_ACLPolicy_SetHash/Nil_Hash_-_Generate
=== RUN   TestStructs_ACLPolicy_SetHash/Hash_Set_-_Dont_Generate
=== RUN   TestStructs_ACLPolicy_SetHash/Hash_Set_-_Generate
--- PASS: TestStructs_ACLPolicy_SetHash (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Nil_Hash_-_Generate (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Hash_Set_-_Dont_Generate (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Hash_Set_-_Generate (0.00s)
=== CONT  TestStructs_ACLServiceIdentity_SyntheticPolicy
=== RUN   TestStructs_ACLServiceIdentity_SyntheticPolicy/web
=== CONT  TestStructs_ACLToken_EstimateSize
=== RUN   TestStructs_ACLCaches/Roles
=== CONT  TestStructs_ACLToken_EmbeddedPolicy
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
=== PAUSE TestStructs_ACLCaches/Roles
--- PASS: TestStructs_ACLToken_EstimateSize (0.00s)
=== CONT  TestStructs_ACLToken_PolicyIDs
=== RUN   TestStructs_ACLToken_PolicyIDs/Basic
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== CONT  TestStructs_ACLToken_Convert
=== RUN   TestStructs_ACLToken_Convert/Management
=== PAUSE TestStructs_ACLToken_Convert/Management
=== RUN   TestStructs_ACLToken_Convert/Client
=== PAUSE TestStructs_ACLToken_Convert/Client
=== RUN   TestStructs_ACLToken_Convert/Unconvertible
=== PAUSE TestStructs_ACLToken_Convert/Unconvertible
=== CONT  TestStructs_ACL_Convert
=== PAUSE TestStructs_ACLToken_PolicyIDs/Basic
=== RUN   TestStructs_ACLToken_PolicyIDs/Legacy_Management
=== PAUSE TestStructs_ACLToken_PolicyIDs/Legacy_Management
--- PASS: TestStructs_ACL_Convert (0.00s)
=== CONT  TestStructs_ACLPolicy_Stub
=== RUN   TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
=== PAUSE TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
--- PASS: TestStructs_ACLPolicy_Stub (0.00s)
=== RUN   TestStructs_ACLToken_PolicyIDs/No_Policies
=== PAUSE TestStructs_ACLToken_PolicyIDs/No_Policies
=== CONT  TestStructs_ACLToken_Stub
=== RUN   TestStructs_ACLToken_Stub/Basic
=== PAUSE TestStructs_ACLToken_Stub/Basic
=== RUN   TestStructs_ACLToken_Stub/Legacy
=== PAUSE TestStructs_ACLToken_Stub/Legacy
=== CONT  TestStructs_FilterFieldConfigurations
=== RUN   TestStructs_FilterFieldConfigurations/Node
=== RUN   TestStructs_ACLServiceIdentity_SyntheticPolicy/companion-cube-99_[dc1,_dc2]
=== PAUSE TestStructs_FilterFieldConfigurations/Node
=== CONT  TestDecodeConfigEntry
=== RUN   TestStructs_FilterFieldConfigurations/NodeService
=== RUN   TestStructs_ACLToken_SetHash/Hash_Set_-_Dont_Generate
=== PAUSE TestStructs_FilterFieldConfigurations/NodeService
=== RUN   TestStructs_FilterFieldConfigurations/ServiceNode
=== PAUSE TestStructs_FilterFieldConfigurations/ServiceNode
=== RUN   TestStructs_ACLToken_SetHash/Hash_Set_-_Generate
=== RUN   TestDecodeConfigEntry/service-defaults
=== PAUSE TestDecodeConfigEntry/service-defaults
=== RUN   TestStructs_FilterFieldConfigurations/HealthCheck
=== RUN   TestDecodeConfigEntry/service-defaults_translations
=== PAUSE TestDecodeConfigEntry/service-defaults_translations
=== RUN   TestDecodeConfigEntry/proxy-defaults
=== PAUSE TestDecodeConfigEntry/proxy-defaults
=== RUN   TestDecodeConfigEntry/proxy-defaults_translations
=== PAUSE TestDecodeConfigEntry/proxy-defaults_translations
=== CONT  TestAgentStructs_CheckTypes
=== PAUSE TestStructs_FilterFieldConfigurations/HealthCheck
=== RUN   TestStructs_FilterFieldConfigurations/CheckServiceNode
--- PASS: TestAgentStructs_CheckTypes (0.00s)
=== CONT  TestStructs_ACLTokenListStubs_Sort
=== PAUSE TestStructs_FilterFieldConfigurations/CheckServiceNode
=== RUN   TestStructs_FilterFieldConfigurations/NodeInfo
=== PAUSE TestStructs_FilterFieldConfigurations/NodeInfo
=== RUN   TestStructs_FilterFieldConfigurations/api.AgentService
=== PAUSE TestStructs_FilterFieldConfigurations/api.AgentService
=== CONT  TestCheckDefinition_Defaults
--- PASS: TestStructs_ACLToken_SetHash (0.01s)
    --- PASS: TestStructs_ACLToken_SetHash/Nil_Hash_-_Generate (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Hash_Set_-_Dont_Generate (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Hash_Set_-_Generate (0.00s)
=== CONT  TestCheckDefinitionToCheckType
--- PASS: TestCheckDefinition_Defaults (0.00s)
=== CONT  TestCheckDefinition_CheckType
--- PASS: TestCheckDefinitionToCheckType (0.00s)
--- PASS: TestStructs_ACLTokenListStubs_Sort (0.00s)
=== CONT  TestStructs_ACLPolicies_resolveWithCache
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Cache_Misses
=== CONT  TestConfigEntryResponseMarshalling
=== RUN   TestConfigEntryResponseMarshalling/nil_entry
=== PAUSE TestConfigEntryResponseMarshalling/nil_entry
=== RUN   TestConfigEntryResponseMarshalling/proxy-default_entry
=== PAUSE TestConfigEntryResponseMarshalling/proxy-default_entry
=== RUN   TestConfigEntryResponseMarshalling/service-default_entry
=== PAUSE TestConfigEntryResponseMarshalling/service-default_entry
=== CONT  TestStructs_ACLPolicies_Compile
=== RUN   TestStructs_ACLPolicies_Compile/Cache_Miss
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Check_Cache
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Cache_Hits
--- PASS: TestStructs_ACLPolicies_resolveWithCache (0.00s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Cache_Misses (0.00s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Check_Cache (0.00s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Cache_Hits (0.00s)
=== CONT  TestStructs_ACLPolicyListStubs_Sort
=== RUN   TestStructs_ACLPolicies_Compile/Check_Cache
--- PASS: TestStructs_ACLPolicyListStubs_Sort (0.00s)
=== CONT  TestStructs_ACLPolicies_Sort
--- PASS: TestStructs_ACLPolicies_Sort (0.00s)
=== CONT  TestStructs_ACLCaches/New
=== RUN   TestStructs_ACLCaches/New/Valid_Sizes
=== PAUSE TestStructs_ACLCaches/New/Valid_Sizes
=== RUN   TestStructs_ACLCaches/New/Zero_Sizes
=== PAUSE TestStructs_ACLCaches/New/Zero_Sizes
=== CONT  TestStructs_ACLCaches/ParsedPolicies
=== RUN   TestStructs_ACLPolicies_Compile/Cache_Hit
=== CONT  TestStructs_ACLCaches/Roles
=== CONT  TestStructs_ACLCaches/Authorizers
--- PASS: TestStructs_ACLPolicies_Compile (0.01s)
    --- PASS: TestStructs_ACLPolicies_Compile/Cache_Miss (0.00s)
    --- PASS: TestStructs_ACLPolicies_Compile/Check_Cache (0.00s)
    --- PASS: TestStructs_ACLPolicies_Compile/Cache_Hit (0.00s)
=== CONT  TestStructs_ACLCaches/Identities
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== CONT  TestStructs_ACLToken_Convert/Management
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
--- PASS: TestStructs_ACLToken_EmbeddedPolicy (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/No_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client (0.00s)
=== CONT  TestStructs_ACLToken_Convert/Unconvertible
=== CONT  TestStructs_ACLToken_PolicyIDs/Basic
=== CONT  TestStructs_ACLToken_PolicyIDs/No_Policies
=== CONT  TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
=== CONT  TestStructs_ACLToken_PolicyIDs/Legacy_Management
--- PASS: TestStructs_ACLToken_PolicyIDs (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Basic (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/No_Policies (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Legacy_Management (0.00s)
=== CONT  TestStructs_ACLToken_Convert/Client
--- PASS: TestCheckDefinition_CheckType (0.01s)
=== CONT  TestStructs_ACLToken_Stub/Basic
--- PASS: TestStructs_ACLToken_Convert (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Management (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Unconvertible (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Client (0.00s)
=== CONT  TestDecodeConfigEntry/service-defaults
=== CONT  TestStructs_ACLToken_Stub/Legacy
=== CONT  TestStructs_ACLCaches/Policies
=== CONT  TestDecodeConfigEntry/proxy-defaults_translations
=== CONT  TestDecodeConfigEntry/proxy-defaults
=== CONT  TestDecodeConfigEntry/service-defaults_translations
=== CONT  TestStructs_FilterFieldConfigurations/Node
=== CONT  TestStructs_FilterFieldConfigurations/CheckServiceNode
=== CONT  TestStructs_FilterFieldConfigurations/api.AgentService
--- PASS: TestStructs_ACLServiceIdentity_SyntheticPolicy (0.02s)
    --- PASS: TestStructs_ACLServiceIdentity_SyntheticPolicy/web (0.00s)
    --- PASS: TestStructs_ACLServiceIdentity_SyntheticPolicy/companion-cube-99_[dc1,_dc2] (0.02s)
=== CONT  TestStructs_FilterFieldConfigurations/NodeInfo
=== CONT  TestStructs_FilterFieldConfigurations/ServiceNode
--- PASS: TestStructs_ACLToken_Stub (0.00s)
    --- PASS: TestStructs_ACLToken_Stub/Basic (0.00s)
    --- PASS: TestStructs_ACLToken_Stub/Legacy (0.00s)
--- PASS: TestDecodeConfigEntry (0.00s)
    --- PASS: TestDecodeConfigEntry/proxy-defaults (0.00s)
    --- PASS: TestDecodeConfigEntry/proxy-defaults_translations (0.00s)
    --- PASS: TestDecodeConfigEntry/service-defaults_translations (0.00s)
    --- PASS: TestDecodeConfigEntry/service-defaults (0.01s)
=== CONT  TestStructs_FilterFieldConfigurations/HealthCheck
=== CONT  TestStructs_FilterFieldConfigurations/NodeService
=== CONT  TestConfigEntryResponseMarshalling/nil_entry
=== CONT  TestStructs_ACLCaches/New/Valid_Sizes
=== CONT  TestConfigEntryResponseMarshalling/proxy-default_entry
=== CONT  TestStructs_ACLCaches/New/Zero_Sizes
--- PASS: TestStructs_ACLCaches (0.00s)
    --- PASS: TestStructs_ACLCaches/ParsedPolicies (0.00s)
    --- PASS: TestStructs_ACLCaches/Roles (0.00s)
    --- PASS: TestStructs_ACLCaches/Identities (0.00s)
    --- PASS: TestStructs_ACLCaches/Authorizers (0.00s)
    --- PASS: TestStructs_ACLCaches/Policies (0.00s)
    --- PASS: TestStructs_ACLCaches/New (0.00s)
        --- PASS: TestStructs_ACLCaches/New/Valid_Sizes (0.00s)
        --- PASS: TestStructs_ACLCaches/New/Zero_Sizes (0.00s)
=== CONT  TestConfigEntryResponseMarshalling/service-default_entry
--- PASS: TestConfigEntryResponseMarshalling (0.00s)
    --- PASS: TestConfigEntryResponseMarshalling/nil_entry (0.00s)
    --- PASS: TestConfigEntryResponseMarshalling/proxy-default_entry (0.00s)
    --- PASS: TestConfigEntryResponseMarshalling/service-default_entry (0.00s)
--- PASS: TestStructs_FilterFieldConfigurations (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/Node (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/api.AgentService (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/HealthCheck (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/CheckServiceNode (0.01s)
    --- PASS: TestStructs_FilterFieldConfigurations/ServiceNode (0.01s)
    --- PASS: TestStructs_FilterFieldConfigurations/NodeInfo (0.01s)
    --- PASS: TestStructs_FilterFieldConfigurations/NodeService (0.01s)
PASS
ok  	github.com/hashicorp/consul/agent/structs	0.193s
?   	github.com/hashicorp/consul/agent/systemd	[no test files]
=== RUN   TestStore_RegularTokens
=== PAUSE TestStore_RegularTokens
=== RUN   TestStore_AgentMasterToken
=== PAUSE TestStore_AgentMasterToken
=== CONT  TestStore_RegularTokens
=== CONT  TestStore_AgentMasterToken
--- PASS: TestStore_AgentMasterToken (0.00s)
=== RUN   TestStore_RegularTokens/set_user_-_config
=== PAUSE TestStore_RegularTokens/set_user_-_config
=== RUN   TestStore_RegularTokens/set_user_-_api
=== PAUSE TestStore_RegularTokens/set_user_-_api
=== RUN   TestStore_RegularTokens/set_agent_-_config
=== PAUSE TestStore_RegularTokens/set_agent_-_config
=== RUN   TestStore_RegularTokens/set_agent_-_api
=== PAUSE TestStore_RegularTokens/set_agent_-_api
=== RUN   TestStore_RegularTokens/set_user_and_agent
=== PAUSE TestStore_RegularTokens/set_user_and_agent
=== RUN   TestStore_RegularTokens/set_repl_-_config
=== PAUSE TestStore_RegularTokens/set_repl_-_config
=== RUN   TestStore_RegularTokens/set_repl_-_api
=== PAUSE TestStore_RegularTokens/set_repl_-_api
=== RUN   TestStore_RegularTokens/set_master_-_config
=== PAUSE TestStore_RegularTokens/set_master_-_config
=== RUN   TestStore_RegularTokens/set_master_-_api
=== PAUSE TestStore_RegularTokens/set_master_-_api
=== RUN   TestStore_RegularTokens/set_all
=== PAUSE TestStore_RegularTokens/set_all
=== CONT  TestStore_RegularTokens/set_user_-_config
=== CONT  TestStore_RegularTokens/set_repl_-_config
=== CONT  TestStore_RegularTokens/set_master_-_config
=== CONT  TestStore_RegularTokens/set_repl_-_api
=== CONT  TestStore_RegularTokens/set_agent_-_config
=== CONT  TestStore_RegularTokens/set_user_-_api
=== CONT  TestStore_RegularTokens/set_user_and_agent
=== CONT  TestStore_RegularTokens/set_all
=== CONT  TestStore_RegularTokens/set_agent_-_api
=== CONT  TestStore_RegularTokens/set_master_-_api
--- PASS: TestStore_RegularTokens (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_repl_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_master_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_repl_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_agent_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_and_agent (0.00s)
    --- PASS: TestStore_RegularTokens/set_all (0.00s)
    --- PASS: TestStore_RegularTokens/set_agent_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_master_-_api (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/token	0.044s
=== RUN   TestClustersFromSnapshot
=== RUN   TestClustersFromSnapshot/defaults
=== RUN   TestClustersFromSnapshot/custom-local-app
=== RUN   TestClustersFromSnapshot/custom-local-app-typed
=== RUN   TestClustersFromSnapshot/custom-upstream
=== RUN   TestClustersFromSnapshot/custom-upstream-typed
=== RUN   TestClustersFromSnapshot/custom-upstream-ignores-tls
=== RUN   TestClustersFromSnapshot/custom-timeouts
--- PASS: TestClustersFromSnapshot (0.54s)
    --- PASS: TestClustersFromSnapshot/defaults (0.09s)
    --- PASS: TestClustersFromSnapshot/custom-local-app (0.06s)
    --- PASS: TestClustersFromSnapshot/custom-local-app-typed (0.05s)
    --- PASS: TestClustersFromSnapshot/custom-upstream (0.08s)
    --- PASS: TestClustersFromSnapshot/custom-upstream-typed (0.05s)
    --- PASS: TestClustersFromSnapshot/custom-upstream-ignores-tls (0.04s)
    --- PASS: TestClustersFromSnapshot/custom-timeouts (0.18s)
=== RUN   TestParseProxyConfig
=== RUN   TestParseProxyConfig/defaults_-_nil
=== RUN   TestParseProxyConfig/defaults_-_empty
=== RUN   TestParseProxyConfig/defaults_-_other_stuff
=== RUN   TestParseProxyConfig/protocol_override
=== RUN   TestParseProxyConfig/protocol_uppercase_override
=== RUN   TestParseProxyConfig/local_connect_timeout_override,_string
=== RUN   TestParseProxyConfig/local_connect_timeout_override,_float_
=== RUN   TestParseProxyConfig/local_connect_timeout_override,_int_
--- PASS: TestParseProxyConfig (0.01s)
    --- PASS: TestParseProxyConfig/defaults_-_nil (0.00s)
    --- PASS: TestParseProxyConfig/defaults_-_empty (0.00s)
    --- PASS: TestParseProxyConfig/defaults_-_other_stuff (0.00s)
    --- PASS: TestParseProxyConfig/protocol_override (0.00s)
    --- PASS: TestParseProxyConfig/protocol_uppercase_override (0.01s)
    --- PASS: TestParseProxyConfig/local_connect_timeout_override,_string (0.00s)
    --- PASS: TestParseProxyConfig/local_connect_timeout_override,_float_ (0.00s)
    --- PASS: TestParseProxyConfig/local_connect_timeout_override,_int_ (0.00s)
=== RUN   TestParseUpstreamConfig
=== RUN   TestParseUpstreamConfig/defaults_-_nil
=== RUN   TestParseUpstreamConfig/defaults_-_empty
=== RUN   TestParseUpstreamConfig/defaults_-_other_stuff
=== RUN   TestParseUpstreamConfig/protocol_override
=== RUN   TestParseUpstreamConfig/connect_timeout_override,_string
=== RUN   TestParseUpstreamConfig/connect_timeout_override,_float_
=== RUN   TestParseUpstreamConfig/connect_timeout_override,_int_
--- PASS: TestParseUpstreamConfig (0.00s)
    --- PASS: TestParseUpstreamConfig/defaults_-_nil (0.00s)
    --- PASS: TestParseUpstreamConfig/defaults_-_empty (0.00s)
    --- PASS: TestParseUpstreamConfig/defaults_-_other_stuff (0.00s)
    --- PASS: TestParseUpstreamConfig/protocol_override (0.00s)
    --- PASS: TestParseUpstreamConfig/connect_timeout_override,_string (0.00s)
    --- PASS: TestParseUpstreamConfig/connect_timeout_override,_float_ (0.00s)
    --- PASS: TestParseUpstreamConfig/connect_timeout_override,_int_ (0.00s)
=== RUN   Test_makeLoadAssignment
=== RUN   Test_makeLoadAssignment/no_instances
=== RUN   Test_makeLoadAssignment/instances,_no_weights
=== RUN   Test_makeLoadAssignment/instances,_healthy_weights
=== RUN   Test_makeLoadAssignment/instances,_warning_weights
--- PASS: Test_makeLoadAssignment (0.02s)
    --- PASS: Test_makeLoadAssignment/no_instances (0.00s)
    --- PASS: Test_makeLoadAssignment/instances,_no_weights (0.00s)
    --- PASS: Test_makeLoadAssignment/instances,_healthy_weights (0.00s)
    --- PASS: Test_makeLoadAssignment/instances,_warning_weights (0.00s)
=== RUN   TestListenersFromSnapshot
=== RUN   TestListenersFromSnapshot/defaults
=== RUN   TestListenersFromSnapshot/http-public-listener
=== RUN   TestListenersFromSnapshot/http-upstream
=== RUN   TestListenersFromSnapshot/custom-public-listener
=== RUN   TestListenersFromSnapshot/custom-public-listener-typed
=== RUN   TestListenersFromSnapshot/custom-public-listener-ignores-tls
=== RUN   TestListenersFromSnapshot/custom-upstream
=== RUN   TestListenersFromSnapshot/custom-upstream-typed
--- PASS: TestListenersFromSnapshot (0.62s)
    --- PASS: TestListenersFromSnapshot/defaults (0.17s)
    --- PASS: TestListenersFromSnapshot/http-public-listener (0.10s)
    --- PASS: TestListenersFromSnapshot/http-upstream (0.08s)
    --- PASS: TestListenersFromSnapshot/custom-public-listener (0.10s)
    --- PASS: TestListenersFromSnapshot/custom-public-listener-typed (0.04s)
    --- PASS: TestListenersFromSnapshot/custom-public-listener-ignores-tls (0.05s)
    --- PASS: TestListenersFromSnapshot/custom-upstream (0.05s)
    --- PASS: TestListenersFromSnapshot/custom-upstream-typed (0.04s)
=== RUN   TestServer_StreamAggregatedResources_BasicProtocol
--- PASS: TestServer_StreamAggregatedResources_BasicProtocol (0.18s)
=== RUN   TestServer_StreamAggregatedResources_ACLEnforcement
--- SKIP: TestServer_StreamAggregatedResources_ACLEnforcement (0.00s)
    server_test.go:347: DM-skipped
=== RUN   TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedDuringDiscoveryRequest
2019/12/30 18:51:42 [DEBUG] Error handling ADS stream: rpc error: code = Unauthenticated desc = unauthenticated: ACL not found
--- PASS: TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedDuringDiscoveryRequest (0.06s)
=== RUN   TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedInBackground
2019/12/30 18:51:42 [DEBUG] Error handling ADS stream: rpc error: code = Unauthenticated desc = unauthenticated: ACL not found
--- PASS: TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedInBackground (0.16s)
=== RUN   TestServer_Check
=== RUN   TestServer_Check/auth_allowed
2019/12/30 18:51:42 [DEBUG] grpc: Connect AuthZ ALLOWED: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db reason=default allow
=== RUN   TestServer_Check/auth_denied
2019/12/30 18:51:42 [DEBUG] grpc: Connect AuthZ DENIED: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db reason=default deny
=== RUN   TestServer_Check/no_source
=== RUN   TestServer_Check/no_dest
=== RUN   TestServer_Check/dest_invalid_format
2019/12/30 18:51:42 [DEBUG] grpc: Connect AuthZ DENIED: bad destination URI: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=not-a-spiffe-id
=== RUN   TestServer_Check/dest_not_a_service_URI
2019/12/30 18:51:42 [DEBUG] grpc: Connect AuthZ DENIED: bad destination service ID: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://trust-domain.consul
=== RUN   TestServer_Check/ACL_not_got_permission_for_authz_call
2019/12/30 18:51:42 [DEBUG] grpc: Connect AuthZ failed ACL check: Permission denied: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db
=== RUN   TestServer_Check/Random_error_running_authz
2019/12/30 18:51:42 [DEBUG] grpc: Connect AuthZ failed: gremlin attack: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db
--- PASS: TestServer_Check (0.00s)
    --- PASS: TestServer_Check/auth_allowed (0.00s)
    --- PASS: TestServer_Check/auth_denied (0.00s)
    --- PASS: TestServer_Check/no_source (0.00s)
    --- PASS: TestServer_Check/no_dest (0.00s)
    --- PASS: TestServer_Check/dest_invalid_format (0.00s)
    --- PASS: TestServer_Check/dest_not_a_service_URI (0.00s)
    --- PASS: TestServer_Check/ACL_not_got_permission_for_authz_call (0.00s)
    --- PASS: TestServer_Check/Random_error_running_authz (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/xds	1.788s
?   	github.com/hashicorp/consul/command	[no test files]
?   	github.com/hashicorp/consul/command/acl	[no test files]
=== RUN   TestAgentTokensCommand_noTabs
=== PAUSE TestAgentTokensCommand_noTabs
=== RUN   TestAgentTokensCommand
=== PAUSE TestAgentTokensCommand
=== CONT  TestAgentTokensCommand_noTabs
=== CONT  TestAgentTokensCommand
--- PASS: TestAgentTokensCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgentTokensCommand - 2019/12/30 18:51:54.358240 [WARN] agent: Node name "Node 43e42d1b-ad87-0612-0317-810e9086a10a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentTokensCommand - 2019/12/30 18:51:54.359356 [DEBUG] tlsutil: Update with version 1
TestAgentTokensCommand - 2019/12/30 18:51:54.371032 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:51:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:43e42d1b-ad87-0612-0317-810e9086a10a Address:127.0.0.1:22006}]
2019/12/30 18:51:55 [INFO]  raft: Node at 127.0.0.1:22006 [Follower] entering Follower state (Leader: "")
TestAgentTokensCommand - 2019/12/30 18:51:55.794721 [INFO] serf: EventMemberJoin: Node 43e42d1b-ad87-0612-0317-810e9086a10a.dc1 127.0.0.1
TestAgentTokensCommand - 2019/12/30 18:51:55.798337 [INFO] serf: EventMemberJoin: Node 43e42d1b-ad87-0612-0317-810e9086a10a 127.0.0.1
TestAgentTokensCommand - 2019/12/30 18:51:55.800013 [INFO] consul: Adding LAN server Node 43e42d1b-ad87-0612-0317-810e9086a10a (Addr: tcp/127.0.0.1:22006) (DC: dc1)
TestAgentTokensCommand - 2019/12/30 18:51:55.800015 [INFO] consul: Handled member-join event for server "Node 43e42d1b-ad87-0612-0317-810e9086a10a.dc1" in area "wan"
TestAgentTokensCommand - 2019/12/30 18:51:55.800971 [INFO] agent: Started DNS server 127.0.0.1:22001 (udp)
TestAgentTokensCommand - 2019/12/30 18:51:55.801055 [INFO] agent: Started DNS server 127.0.0.1:22001 (tcp)
TestAgentTokensCommand - 2019/12/30 18:51:55.803638 [INFO] agent: Started HTTP server on 127.0.0.1:22002 (tcp)
TestAgentTokensCommand - 2019/12/30 18:51:55.803824 [INFO] agent: started state syncer
2019/12/30 18:51:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:51:55 [INFO]  raft: Node at 127.0.0.1:22006 [Candidate] entering Candidate state in term 2
2019/12/30 18:51:56 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:51:56 [INFO]  raft: Node at 127.0.0.1:22006 [Leader] entering Leader state
TestAgentTokensCommand - 2019/12/30 18:51:56.912903 [INFO] consul: cluster leadership acquired
TestAgentTokensCommand - 2019/12/30 18:51:56.913477 [INFO] consul: New leader elected: Node 43e42d1b-ad87-0612-0317-810e9086a10a
TestAgentTokensCommand - 2019/12/30 18:51:56.943400 [ERR] agent: failed to sync remote state: ACL not found
TestAgentTokensCommand - 2019/12/30 18:51:57.352096 [INFO] acl: initializing acls
TestAgentTokensCommand - 2019/12/30 18:51:57.972947 [INFO] consul: Created ACL 'global-management' policy
TestAgentTokensCommand - 2019/12/30 18:51:57.973254 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentTokensCommand - 2019/12/30 18:51:57.974747 [INFO] acl: initializing acls
TestAgentTokensCommand - 2019/12/30 18:51:57.974888 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentTokensCommand - 2019/12/30 18:51:58.372853 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentTokensCommand - 2019/12/30 18:51:58.855640 [INFO] consul: Created ACL anonymous token from configuration
TestAgentTokensCommand - 2019/12/30 18:51:58.855765 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentTokensCommand - 2019/12/30 18:51:58.856552 [INFO] serf: EventMemberUpdate: Node 43e42d1b-ad87-0612-0317-810e9086a10a
TestAgentTokensCommand - 2019/12/30 18:51:58.857175 [INFO] serf: EventMemberUpdate: Node 43e42d1b-ad87-0612-0317-810e9086a10a.dc1
TestAgentTokensCommand - 2019/12/30 18:51:58.857308 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentTokensCommand - 2019/12/30 18:51:58.860454 [INFO] serf: EventMemberUpdate: Node 43e42d1b-ad87-0612-0317-810e9086a10a
TestAgentTokensCommand - 2019/12/30 18:51:58.862395 [INFO] serf: EventMemberUpdate: Node 43e42d1b-ad87-0612-0317-810e9086a10a.dc1
TestAgentTokensCommand - 2019/12/30 18:52:00.914627 [INFO] agent: Synced node info
TestAgentTokensCommand - 2019/12/30 18:52:00.914736 [DEBUG] agent: Node info in sync
TestAgentTokensCommand - 2019/12/30 18:52:01.187491 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentTokensCommand - 2019/12/30 18:52:01.905655 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentTokensCommand - 2019/12/30 18:52:01.906094 [DEBUG] consul: Skipping self join check for "Node 43e42d1b-ad87-0612-0317-810e9086a10a" since the cluster is too small
TestAgentTokensCommand - 2019/12/30 18:52:01.906250 [INFO] consul: member 'Node 43e42d1b-ad87-0612-0317-810e9086a10a' joined, marking health alive
TestAgentTokensCommand - 2019/12/30 18:52:01.909243 [DEBUG] http: Request PUT /v1/acl/token (966.727327ms) from=127.0.0.1:32996
TestAgentTokensCommand - 2019/12/30 18:52:01.916597 [INFO] agent: Updated agent's ACL token "default"
TestAgentTokensCommand - 2019/12/30 18:52:01.916702 [DEBUG] http: Request PUT /v1/agent/token/default (790.354µs) from=127.0.0.1:32998
TestAgentTokensCommand - 2019/12/30 18:52:01.922386 [INFO] agent: Updated agent's ACL token "agent"
TestAgentTokensCommand - 2019/12/30 18:52:01.922485 [DEBUG] http: Request PUT /v1/agent/token/agent (699.018µs) from=127.0.0.1:33000
TestAgentTokensCommand - 2019/12/30 18:52:01.926085 [INFO] agent: Updated agent's ACL token "agent_master"
TestAgentTokensCommand - 2019/12/30 18:52:01.926198 [DEBUG] http: Request PUT /v1/agent/token/agent_master (795.687µs) from=127.0.0.1:33002
TestAgentTokensCommand - 2019/12/30 18:52:01.929028 [INFO] agent: Updated agent's ACL token "replication"
TestAgentTokensCommand - 2019/12/30 18:52:01.929121 [DEBUG] http: Request PUT /v1/agent/token/replication (611.349µs) from=127.0.0.1:33004
TestAgentTokensCommand - 2019/12/30 18:52:01.930158 [INFO] agent: Requesting shutdown
TestAgentTokensCommand - 2019/12/30 18:52:01.930276 [INFO] consul: shutting down server
TestAgentTokensCommand - 2019/12/30 18:52:01.930364 [WARN] serf: Shutdown without a Leave
TestAgentTokensCommand - 2019/12/30 18:52:02.081395 [WARN] serf: Shutdown without a Leave
TestAgentTokensCommand - 2019/12/30 18:52:02.153926 [INFO] manager: shutting down
TestAgentTokensCommand - 2019/12/30 18:52:02.154168 [ERR] consul: failed to reconcile member: {Node 43e42d1b-ad87-0612-0317-810e9086a10a 127.0.0.1 22004 map[acls:1 bootstrap:1 build:1.5.2: dc:dc1 id:43e42d1b-ad87-0612-0317-810e9086a10a port:22006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:22005] alive 1 5 2 2 5 4}: leadership lost while committing log
TestAgentTokensCommand - 2019/12/30 18:52:02.154636 [INFO] agent: consul server down
TestAgentTokensCommand - 2019/12/30 18:52:02.154695 [INFO] agent: shutdown complete
TestAgentTokensCommand - 2019/12/30 18:52:02.154717 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestAgentTokensCommand - 2019/12/30 18:52:02.154924 [ERR] consul: failed to reconcile member: {Node 43e42d1b-ad87-0612-0317-810e9086a10a 127.0.0.1 22004 map[acls:1 bootstrap:1 build:1.5.2: dc:dc1 id:43e42d1b-ad87-0612-0317-810e9086a10a port:22006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:22005] alive 1 5 2 2 5 4}: raft is already shutdown
TestAgentTokensCommand - 2019/12/30 18:52:02.154749 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (tcp)
TestAgentTokensCommand - 2019/12/30 18:52:02.155197 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (udp)
TestAgentTokensCommand - 2019/12/30 18:52:02.155359 [INFO] agent: Stopping HTTP server 127.0.0.1:22002 (tcp)
TestAgentTokensCommand - 2019/12/30 18:52:02.155448 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestAgentTokensCommand - 2019/12/30 18:52:02.155659 [ERR] consul: failed to reconcile member: {Node 43e42d1b-ad87-0612-0317-810e9086a10a 127.0.0.1 22004 map[acls:1 bootstrap:1 build:1.5.2: dc:dc1 id:43e42d1b-ad87-0612-0317-810e9086a10a port:22006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:22005] alive 1 5 2 2 5 4}: raft is already shutdown
TestAgentTokensCommand - 2019/12/30 18:52:02.156428 [INFO] agent: Waiting for endpoints to shut down
TestAgentTokensCommand - 2019/12/30 18:52:02.156620 [INFO] agent: Endpoints down
--- PASS: TestAgentTokensCommand (7.88s)
PASS
ok  	github.com/hashicorp/consul/command/acl/agenttokens	8.218s
?   	github.com/hashicorp/consul/command/acl/authmethod	[no test files]
=== RUN   TestAuthMethodCreateCommand_noTabs
=== PAUSE TestAuthMethodCreateCommand_noTabs
=== RUN   TestAuthMethodCreateCommand
=== PAUSE TestAuthMethodCreateCommand
=== RUN   TestAuthMethodCreateCommand_k8s
=== PAUSE TestAuthMethodCreateCommand_k8s
=== CONT  TestAuthMethodCreateCommand_noTabs
=== CONT  TestAuthMethodCreateCommand_k8s
=== CONT  TestAuthMethodCreateCommand
--- PASS: TestAuthMethodCreateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodCreateCommand - 2019/12/30 18:52:09.005624 [WARN] agent: Node name "Node c7d97541-d537-58f1-8fee-d0eb29efba4c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodCreateCommand - 2019/12/30 18:52:09.006658 [DEBUG] tlsutil: Update with version 1
TestAuthMethodCreateCommand - 2019/12/30 18:52:09.013131 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:09.022701 [WARN] agent: Node name "Node 4d042857-7759-a707-f823-181f593d0182" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:09.023296 [DEBUG] tlsutil: Update with version 1
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:09.027585 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c7d97541-d537-58f1-8fee-d0eb29efba4c Address:127.0.0.1:38506}]
2019/12/30 18:52:10 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
2019/12/30 18:52:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4d042857-7759-a707-f823-181f593d0182 Address:127.0.0.1:38512}]
2019/12/30 18:52:10 [INFO]  raft: Node at 127.0.0.1:38512 [Follower] entering Follower state (Leader: "")
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.226719 [INFO] serf: EventMemberJoin: Node c7d97541-d537-58f1-8fee-d0eb29efba4c.dc1 127.0.0.1
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.232379 [INFO] serf: EventMemberJoin: Node 4d042857-7759-a707-f823-181f593d0182.dc1 127.0.0.1
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.232952 [INFO] serf: EventMemberJoin: Node c7d97541-d537-58f1-8fee-d0eb29efba4c 127.0.0.1
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.237473 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.245087 [INFO] consul: Handled member-join event for server "Node c7d97541-d537-58f1-8fee-d0eb29efba4c.dc1" in area "wan"
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.248603 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.251940 [INFO] consul: Adding LAN server Node c7d97541-d537-58f1-8fee-d0eb29efba4c (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.253599 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestAuthMethodCreateCommand - 2019/12/30 18:52:10.253885 [INFO] agent: started state syncer
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.255669 [INFO] serf: EventMemberJoin: Node 4d042857-7759-a707-f823-181f593d0182 127.0.0.1
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.256683 [INFO] consul: Adding LAN server Node 4d042857-7759-a707-f823-181f593d0182 (Addr: tcp/127.0.0.1:38512) (DC: dc1)
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.256764 [INFO] consul: Handled member-join event for server "Node 4d042857-7759-a707-f823-181f593d0182.dc1" in area "wan"
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.257315 [INFO] agent: Started DNS server 127.0.0.1:38507 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.257395 [INFO] agent: Started DNS server 127.0.0.1:38507 (udp)
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.260117 [INFO] agent: Started HTTP server on 127.0.0.1:38508 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:10.260274 [INFO] agent: started state syncer
2019/12/30 18:52:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:10 [INFO]  raft: Node at 127.0.0.1:38512 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:10 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:11 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:11 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
2019/12/30 18:52:11 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:11 [INFO]  raft: Node at 127.0.0.1:38512 [Leader] entering Leader state
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.004676 [INFO] consul: cluster leadership acquired
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.005337 [INFO] consul: New leader elected: Node c7d97541-d537-58f1-8fee-d0eb29efba4c
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.005676 [INFO] consul: cluster leadership acquired
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.006105 [INFO] consul: New leader elected: Node 4d042857-7759-a707-f823-181f593d0182
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.006346 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.010173 [INFO] acl: initializing acls
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.054811 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.462750 [INFO] acl: initializing acls
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.546239 [INFO] acl: initializing acls
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.546699 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.546771 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.661681 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.661785 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.771654 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:11.771756 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.797597 [INFO] acl: initializing acls
TestAuthMethodCreateCommand - 2019/12/30 18:52:11.800169 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand - 2019/12/30 18:52:12.716856 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodCreateCommand - 2019/12/30 18:52:12.718486 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:12.721039 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:12.723541 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:13.060090 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:13.060207 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:13.061070 [INFO] serf: EventMemberUpdate: Node 4d042857-7759-a707-f823-181f593d0182
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:13.061690 [INFO] serf: EventMemberUpdate: Node 4d042857-7759-a707-f823-181f593d0182.dc1
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.063249 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.063360 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.064213 [INFO] serf: EventMemberUpdate: Node c7d97541-d537-58f1-8fee-d0eb29efba4c
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.067949 [INFO] serf: EventMemberUpdate: Node c7d97541-d537-58f1-8fee-d0eb29efba4c.dc1
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:13.256864 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:13.257706 [INFO] serf: EventMemberUpdate: Node 4d042857-7759-a707-f823-181f593d0182
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:13.258462 [INFO] serf: EventMemberUpdate: Node 4d042857-7759-a707-f823-181f593d0182.dc1
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.468368 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.468801 [INFO] agent: Synced node info
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.468888 [DEBUG] agent: Node info in sync
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.469192 [INFO] serf: EventMemberUpdate: Node c7d97541-d537-58f1-8fee-d0eb29efba4c
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.469930 [INFO] serf: EventMemberUpdate: Node c7d97541-d537-58f1-8fee-d0eb29efba4c.dc1
=== RUN   TestAuthMethodCreateCommand/type_required
=== RUN   TestAuthMethodCreateCommand/name_required
=== RUN   TestAuthMethodCreateCommand/invalid_type
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.496451 [ERR] http: Request PUT /v1/acl/auth-method, error: Invalid Auth Method: Type should be one of: [kubernetes testing] from=127.0.0.1:47102
TestAuthMethodCreateCommand - 2019/12/30 18:52:13.507315 [DEBUG] http: Request PUT /v1/acl/auth-method (13.202684ms) from=127.0.0.1:47102
=== RUN   TestAuthMethodCreateCommand/create_testing
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:14.547401 [INFO] agent: Synced node info
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:14.553588 [DEBUG] agent: Node info in sync
=== RUN   TestAuthMethodCreateCommand_k8s/k8s_host_required
=== RUN   TestAuthMethodCreateCommand_k8s/k8s_ca_cert_required
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.560591 [DEBUG] http: Request PUT /v1/acl/auth-method (1.042981342s) from=127.0.0.1:47104
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.563366 [INFO] agent: Requesting shutdown
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.563457 [INFO] consul: shutting down server
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.563509 [WARN] serf: Shutdown without a Leave
=== RUN   TestAuthMethodCreateCommand_k8s/k8s_jwt_required
=== RUN   TestAuthMethodCreateCommand_k8s/create_k8s
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.731228 [WARN] serf: Shutdown without a Leave
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.829316 [INFO] manager: shutting down
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.829940 [INFO] agent: consul server down
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.830007 [INFO] agent: shutdown complete
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.830067 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.830213 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.830385 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.831111 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.831177 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodCreateCommand (5.92s)
    --- PASS: TestAuthMethodCreateCommand/type_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand/name_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand/invalid_type (0.03s)
    --- PASS: TestAuthMethodCreateCommand/create_testing (1.05s)
TestAuthMethodCreateCommand - 2019/12/30 18:52:14.842091 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:15.348090 [DEBUG] http: Request PUT /v1/acl/auth-method (748.340856ms) from=127.0.0.1:42664
=== RUN   TestAuthMethodCreateCommand_k8s/create_k8s_with_cert_file
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:15.364318 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:15.365306 [DEBUG] consul: Skipping self join check for "Node 4d042857-7759-a707-f823-181f593d0182" since the cluster is too small
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:15.365730 [INFO] consul: member 'Node 4d042857-7759-a707-f823-181f593d0182' joined, marking health alive
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:15.673068 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.655905 [DEBUG] consul: Skipping self join check for "Node 4d042857-7759-a707-f823-181f593d0182" since the cluster is too small
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.656500 [DEBUG] consul: Skipping self join check for "Node 4d042857-7759-a707-f823-181f593d0182" since the cluster is too small
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.658694 [DEBUG] http: Request PUT /v1/acl/auth-method (1.28383073s) from=127.0.0.1:42666
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.665103 [INFO] agent: Requesting shutdown
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.665223 [INFO] consul: shutting down server
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.665276 [WARN] serf: Shutdown without a Leave
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.804149 [WARN] serf: Shutdown without a Leave
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.879677 [INFO] manager: shutting down
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.881681 [INFO] agent: consul server down
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.881796 [INFO] agent: shutdown complete
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.881896 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.882121 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (udp)
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.882432 [INFO] agent: Stopping HTTP server 127.0.0.1:38508 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.883594 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodCreateCommand_k8s - 2019/12/30 18:52:16.883733 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodCreateCommand_k8s (7.97s)
    --- PASS: TestAuthMethodCreateCommand_k8s/k8s_host_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand_k8s/k8s_ca_cert_required (0.01s)
    --- PASS: TestAuthMethodCreateCommand_k8s/k8s_jwt_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand_k8s/create_k8s (0.76s)
    --- PASS: TestAuthMethodCreateCommand_k8s/create_k8s_with_cert_file (1.31s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/create	8.218s
=== RUN   TestAuthMethodDeleteCommand_noTabs
=== PAUSE TestAuthMethodDeleteCommand_noTabs
=== RUN   TestAuthMethodDeleteCommand
=== PAUSE TestAuthMethodDeleteCommand
=== CONT  TestAuthMethodDeleteCommand_noTabs
=== CONT  TestAuthMethodDeleteCommand
--- PASS: TestAuthMethodDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodDeleteCommand - 2019/12/30 18:52:23.699262 [WARN] agent: Node name "Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodDeleteCommand - 2019/12/30 18:52:23.700481 [DEBUG] tlsutil: Update with version 1
TestAuthMethodDeleteCommand - 2019/12/30 18:52:23.710012 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:46b0d4c3-0115-c834-ca2c-9b363b3ec07c Address:127.0.0.1:53506}]
2019/12/30 18:52:24 [INFO]  raft: Node at 127.0.0.1:53506 [Follower] entering Follower state (Leader: "")
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.779147 [INFO] serf: EventMemberJoin: Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c.dc1 127.0.0.1
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.783198 [INFO] serf: EventMemberJoin: Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c 127.0.0.1
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.784528 [INFO] consul: Handled member-join event for server "Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c.dc1" in area "wan"
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.785264 [INFO] consul: Adding LAN server Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c (Addr: tcp/127.0.0.1:53506) (DC: dc1)
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.786037 [INFO] agent: Started DNS server 127.0.0.1:53501 (tcp)
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.786609 [INFO] agent: Started DNS server 127.0.0.1:53501 (udp)
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.789494 [INFO] agent: Started HTTP server on 127.0.0.1:53502 (tcp)
TestAuthMethodDeleteCommand - 2019/12/30 18:52:24.789905 [INFO] agent: started state syncer
2019/12/30 18:52:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:24 [INFO]  raft: Node at 127.0.0.1:53506 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:25 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:25 [INFO]  raft: Node at 127.0.0.1:53506 [Leader] entering Leader state
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.380591 [INFO] consul: cluster leadership acquired
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.381105 [INFO] consul: New leader elected: Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.536066 [INFO] acl: initializing acls
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.730989 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.854977 [INFO] acl: initializing acls
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.855715 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.855857 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodDeleteCommand - 2019/12/30 18:52:25.976921 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodDeleteCommand - 2019/12/30 18:52:26.155832 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodDeleteCommand - 2019/12/30 18:52:26.157003 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodDeleteCommand - 2019/12/30 18:52:26.864682 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodDeleteCommand - 2019/12/30 18:52:26.864733 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodDeleteCommand - 2019/12/30 18:52:27.109924 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodDeleteCommand - 2019/12/30 18:52:27.110811 [INFO] serf: EventMemberUpdate: Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c
TestAuthMethodDeleteCommand - 2019/12/30 18:52:27.111366 [INFO] serf: EventMemberUpdate: Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c.dc1
TestAuthMethodDeleteCommand - 2019/12/30 18:52:27.482307 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodDeleteCommand - 2019/12/30 18:52:27.482377 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodDeleteCommand - 2019/12/30 18:52:27.483145 [INFO] serf: EventMemberUpdate: Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c
TestAuthMethodDeleteCommand - 2019/12/30 18:52:27.483781 [INFO] serf: EventMemberUpdate: Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c.dc1
TestAuthMethodDeleteCommand - 2019/12/30 18:52:28.748068 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodDeleteCommand - 2019/12/30 18:52:28.748571 [DEBUG] consul: Skipping self join check for "Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c" since the cluster is too small
TestAuthMethodDeleteCommand - 2019/12/30 18:52:28.748678 [INFO] consul: member 'Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c' joined, marking health alive
TestAuthMethodDeleteCommand - 2019/12/30 18:52:28.968343 [DEBUG] consul: Skipping self join check for "Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c" since the cluster is too small
TestAuthMethodDeleteCommand - 2019/12/30 18:52:28.968904 [DEBUG] consul: Skipping self join check for "Node 46b0d4c3-0115-c834-ca2c-9b363b3ec07c" since the cluster is too small
=== RUN   TestAuthMethodDeleteCommand/name_required
=== RUN   TestAuthMethodDeleteCommand/delete_notfound
TestAuthMethodDeleteCommand - 2019/12/30 18:52:28.983567 [DEBUG] http: Request DELETE /v1/acl/auth-method/notfound (3.637096ms) from=127.0.0.1:37870
=== RUN   TestAuthMethodDeleteCommand/delete_works
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.303787 [DEBUG] http: Request PUT /v1/acl/auth-method (316.364391ms) from=127.0.0.1:37872
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.622822 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.623200 [DEBUG] http: Request DELETE /v1/acl/auth-method/test-eace2be4-1e27-2f83-9a62-08632c6dd7ca (310.858578ms) from=127.0.0.1:37874
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.625881 [DEBUG] http: Request GET /v1/acl/auth-method/test-eace2be4-1e27-2f83-9a62-08632c6dd7ca (408.011µs) from=127.0.0.1:37872
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.627666 [INFO] agent: Requesting shutdown
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.627736 [INFO] consul: shutting down server
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.627780 [WARN] serf: Shutdown without a Leave
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.730136 [WARN] serf: Shutdown without a Leave
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.879554 [INFO] manager: shutting down
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.880335 [INFO] agent: consul server down
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.880384 [INFO] agent: shutdown complete
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.880475 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (tcp)
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.880647 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (udp)
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.880840 [INFO] agent: Stopping HTTP server 127.0.0.1:53502 (tcp)
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.881357 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodDeleteCommand - 2019/12/30 18:52:29.881474 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodDeleteCommand (6.38s)
    --- PASS: TestAuthMethodDeleteCommand/name_required (0.00s)
    --- PASS: TestAuthMethodDeleteCommand/delete_notfound (0.01s)
    --- PASS: TestAuthMethodDeleteCommand/delete_works (0.64s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/delete	6.913s
=== RUN   TestAuthMethodListCommand_noTabs
=== PAUSE TestAuthMethodListCommand_noTabs
=== RUN   TestAuthMethodListCommand
=== PAUSE TestAuthMethodListCommand
=== CONT  TestAuthMethodListCommand_noTabs
=== CONT  TestAuthMethodListCommand
--- PASS: TestAuthMethodListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodListCommand - 2019/12/30 18:52:44.782333 [WARN] agent: Node name "Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodListCommand - 2019/12/30 18:52:44.783072 [DEBUG] tlsutil: Update with version 1
TestAuthMethodListCommand - 2019/12/30 18:52:44.789606 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:52:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:61e6204f-a2b5-fbc1-c586-befc7c9b5196 Address:127.0.0.1:20506}]
2019/12/30 18:52:46 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestAuthMethodListCommand - 2019/12/30 18:52:46.011465 [INFO] serf: EventMemberJoin: Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196.dc1 127.0.0.1
TestAuthMethodListCommand - 2019/12/30 18:52:46.019085 [INFO] serf: EventMemberJoin: Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196 127.0.0.1
TestAuthMethodListCommand - 2019/12/30 18:52:46.020190 [INFO] consul: Adding LAN server Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196 (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestAuthMethodListCommand - 2019/12/30 18:52:46.020808 [INFO] consul: Handled member-join event for server "Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196.dc1" in area "wan"
TestAuthMethodListCommand - 2019/12/30 18:52:46.027193 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestAuthMethodListCommand - 2019/12/30 18:52:46.027764 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestAuthMethodListCommand - 2019/12/30 18:52:46.030649 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestAuthMethodListCommand - 2019/12/30 18:52:46.030798 [INFO] agent: started state syncer
2019/12/30 18:52:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:52:46 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/30 18:52:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:52:46 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestAuthMethodListCommand - 2019/12/30 18:52:46.665536 [INFO] consul: cluster leadership acquired
TestAuthMethodListCommand - 2019/12/30 18:52:46.666020 [INFO] consul: New leader elected: Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196
TestAuthMethodListCommand - 2019/12/30 18:52:46.772469 [INFO] acl: initializing acls
TestAuthMethodListCommand - 2019/12/30 18:52:46.803544 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodListCommand - 2019/12/30 18:52:47.014270 [INFO] acl: initializing acls
TestAuthMethodListCommand - 2019/12/30 18:52:47.016919 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodListCommand - 2019/12/30 18:52:47.017033 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodListCommand - 2019/12/30 18:52:47.331028 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodListCommand - 2019/12/30 18:52:47.331538 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodListCommand - 2019/12/30 18:52:47.332020 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodListCommand - 2019/12/30 18:52:47.466974 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodListCommand - 2019/12/30 18:52:48.073043 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodListCommand - 2019/12/30 18:52:48.073182 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodListCommand - 2019/12/30 18:52:48.074263 [INFO] serf: EventMemberUpdate: Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196
TestAuthMethodListCommand - 2019/12/30 18:52:48.073079 [INFO] consul: Created ACL master token from configuration
TestAuthMethodListCommand - 2019/12/30 18:52:48.075591 [INFO] serf: EventMemberUpdate: Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196
TestAuthMethodListCommand - 2019/12/30 18:52:48.076843 [INFO] serf: EventMemberUpdate: Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196.dc1
TestAuthMethodListCommand - 2019/12/30 18:52:48.078960 [INFO] serf: EventMemberUpdate: Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196.dc1
TestAuthMethodListCommand - 2019/12/30 18:52:49.755884 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodListCommand - 2019/12/30 18:52:49.756453 [DEBUG] consul: Skipping self join check for "Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196" since the cluster is too small
TestAuthMethodListCommand - 2019/12/30 18:52:49.756576 [INFO] consul: member 'Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196' joined, marking health alive
TestAuthMethodListCommand - 2019/12/30 18:52:50.016098 [DEBUG] consul: Skipping self join check for "Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196" since the cluster is too small
TestAuthMethodListCommand - 2019/12/30 18:52:50.016648 [DEBUG] consul: Skipping self join check for "Node 61e6204f-a2b5-fbc1-c586-befc7c9b5196" since the cluster is too small
=== RUN   TestAuthMethodListCommand/found_none
TestAuthMethodListCommand - 2019/12/30 18:52:50.046629 [DEBUG] http: Request GET /v1/acl/auth-methods (4.404116ms) from=127.0.0.1:32876
TestAuthMethodListCommand - 2019/12/30 18:52:50.283977 [DEBUG] http: Request PUT /v1/acl/auth-method (232.971176ms) from=127.0.0.1:32878
TestAuthMethodListCommand - 2019/12/30 18:52:50.285771 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodListCommand - 2019/12/30 18:52:50.465081 [DEBUG] http: Request PUT /v1/acl/auth-method (178.052387ms) from=127.0.0.1:32878
TestAuthMethodListCommand - 2019/12/30 18:52:50.732170 [DEBUG] http: Request PUT /v1/acl/auth-method (250.691978ms) from=127.0.0.1:32878
TestAuthMethodListCommand - 2019/12/30 18:52:51.021252 [DEBUG] http: Request PUT /v1/acl/auth-method (282.726161ms) from=127.0.0.1:32878
TestAuthMethodListCommand - 2019/12/30 18:52:51.223880 [DEBUG] http: Request PUT /v1/acl/auth-method (195.679521ms) from=127.0.0.1:32878
=== RUN   TestAuthMethodListCommand/found_some
TestAuthMethodListCommand - 2019/12/30 18:52:51.239867 [DEBUG] http: Request GET /v1/acl/auth-methods (5.484145ms) from=127.0.0.1:32880
TestAuthMethodListCommand - 2019/12/30 18:52:51.244000 [INFO] agent: Requesting shutdown
TestAuthMethodListCommand - 2019/12/30 18:52:51.244092 [INFO] consul: shutting down server
TestAuthMethodListCommand - 2019/12/30 18:52:51.244145 [WARN] serf: Shutdown without a Leave
TestAuthMethodListCommand - 2019/12/30 18:52:52.571569 [WARN] serf: Shutdown without a Leave
TestAuthMethodListCommand - 2019/12/30 18:52:52.705017 [INFO] manager: shutting down
TestAuthMethodListCommand - 2019/12/30 18:52:52.705746 [INFO] agent: consul server down
TestAuthMethodListCommand - 2019/12/30 18:52:52.705806 [INFO] agent: shutdown complete
TestAuthMethodListCommand - 2019/12/30 18:52:52.705862 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestAuthMethodListCommand - 2019/12/30 18:52:52.706015 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestAuthMethodListCommand - 2019/12/30 18:52:52.706179 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestAuthMethodListCommand - 2019/12/30 18:52:52.706831 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodListCommand - 2019/12/30 18:52:52.707019 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodListCommand (8.00s)
    --- PASS: TestAuthMethodListCommand/found_none (0.01s)
    --- PASS: TestAuthMethodListCommand/found_some (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/list	8.255s
=== RUN   TestAuthMethodReadCommand_noTabs
=== PAUSE TestAuthMethodReadCommand_noTabs
=== RUN   TestAuthMethodReadCommand
=== PAUSE TestAuthMethodReadCommand
=== CONT  TestAuthMethodReadCommand_noTabs
=== CONT  TestAuthMethodReadCommand
--- PASS: TestAuthMethodReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodReadCommand - 2019/12/30 18:52:59.612355 [WARN] agent: Node name "Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodReadCommand - 2019/12/30 18:52:59.613326 [DEBUG] tlsutil: Update with version 1
TestAuthMethodReadCommand - 2019/12/30 18:52:59.635247 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:198c9cdd-18ea-312c-6f4b-5b0e9543ad96 Address:127.0.0.1:49006}]
2019/12/30 18:53:01 [INFO]  raft: Node at 127.0.0.1:49006 [Follower] entering Follower state (Leader: "")
TestAuthMethodReadCommand - 2019/12/30 18:53:01.120945 [INFO] serf: EventMemberJoin: Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96.dc1 127.0.0.1
TestAuthMethodReadCommand - 2019/12/30 18:53:01.145189 [INFO] serf: EventMemberJoin: Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96 127.0.0.1
TestAuthMethodReadCommand - 2019/12/30 18:53:01.146622 [INFO] consul: Adding LAN server Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96 (Addr: tcp/127.0.0.1:49006) (DC: dc1)
TestAuthMethodReadCommand - 2019/12/30 18:53:01.147051 [INFO] consul: Handled member-join event for server "Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96.dc1" in area "wan"
TestAuthMethodReadCommand - 2019/12/30 18:53:01.147500 [INFO] agent: Started DNS server 127.0.0.1:49001 (udp)
TestAuthMethodReadCommand - 2019/12/30 18:53:01.147571 [INFO] agent: Started DNS server 127.0.0.1:49001 (tcp)
2019/12/30 18:53:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:01 [INFO]  raft: Node at 127.0.0.1:49006 [Candidate] entering Candidate state in term 2
TestAuthMethodReadCommand - 2019/12/30 18:53:01.165735 [INFO] agent: Started HTTP server on 127.0.0.1:49002 (tcp)
TestAuthMethodReadCommand - 2019/12/30 18:53:01.166188 [INFO] agent: started state syncer
2019/12/30 18:53:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:01 [INFO]  raft: Node at 127.0.0.1:49006 [Leader] entering Leader state
TestAuthMethodReadCommand - 2019/12/30 18:53:01.705708 [INFO] consul: cluster leadership acquired
TestAuthMethodReadCommand - 2019/12/30 18:53:01.706429 [INFO] consul: New leader elected: Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96
TestAuthMethodReadCommand - 2019/12/30 18:53:01.905490 [INFO] acl: initializing acls
TestAuthMethodReadCommand - 2019/12/30 18:53:01.914176 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodReadCommand - 2019/12/30 18:53:02.497724 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodReadCommand - 2019/12/30 18:53:02.497825 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodReadCommand - 2019/12/30 18:53:02.499827 [INFO] acl: initializing acls
TestAuthMethodReadCommand - 2019/12/30 18:53:02.499954 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodReadCommand - 2019/12/30 18:53:03.181545 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodReadCommand - 2019/12/30 18:53:03.181699 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodReadCommand - 2019/12/30 18:53:03.790576 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodReadCommand - 2019/12/30 18:53:03.791578 [INFO] serf: EventMemberUpdate: Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96
TestAuthMethodReadCommand - 2019/12/30 18:53:03.791894 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodReadCommand - 2019/12/30 18:53:03.791965 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodReadCommand - 2019/12/30 18:53:03.792314 [INFO] serf: EventMemberUpdate: Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96.dc1
TestAuthMethodReadCommand - 2019/12/30 18:53:03.793461 [INFO] serf: EventMemberUpdate: Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96
TestAuthMethodReadCommand - 2019/12/30 18:53:03.794248 [INFO] serf: EventMemberUpdate: Node 198c9cdd-18ea-312c-6f4b-5b0e9543ad96.dc1
TestAuthMethodReadCommand - 2019/12/30 18:53:04.333951 [INFO] agent: Synced node info
TestAuthMethodReadCommand - 2019/12/30 18:53:04.334074 [DEBUG] agent: Node info in sync
=== RUN   TestAuthMethodReadCommand/name_required
=== RUN   TestAuthMethodReadCommand/not_found
TestAuthMethodReadCommand - 2019/12/30 18:53:04.373061 [DEBUG] http: Request GET /v1/acl/auth-method/notfound (4.216778ms) from=127.0.0.1:48654
=== RUN   TestAuthMethodReadCommand/read_by_name
TestAuthMethodReadCommand - 2019/12/30 18:53:04.983622 [DEBUG] http: Request PUT /v1/acl/auth-method (589.897632ms) from=127.0.0.1:48656
TestAuthMethodReadCommand - 2019/12/30 18:53:05.003743 [DEBUG] http: Request GET /v1/acl/auth-method/test-da5bd396-fa7d-250b-a970-cef9ffb23bcd (2.476065ms) from=127.0.0.1:48658
TestAuthMethodReadCommand - 2019/12/30 18:53:05.006993 [INFO] agent: Requesting shutdown
TestAuthMethodReadCommand - 2019/12/30 18:53:05.007142 [INFO] consul: shutting down server
TestAuthMethodReadCommand - 2019/12/30 18:53:05.007195 [WARN] serf: Shutdown without a Leave
TestAuthMethodReadCommand - 2019/12/30 18:53:05.180241 [WARN] serf: Shutdown without a Leave
TestAuthMethodReadCommand - 2019/12/30 18:53:05.472094 [INFO] manager: shutting down
TestAuthMethodReadCommand - 2019/12/30 18:53:05.839256 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestAuthMethodReadCommand - 2019/12/30 18:53:05.839576 [INFO] agent: consul server down
TestAuthMethodReadCommand - 2019/12/30 18:53:05.839634 [INFO] agent: shutdown complete
TestAuthMethodReadCommand - 2019/12/30 18:53:05.841732 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (tcp)
TestAuthMethodReadCommand - 2019/12/30 18:53:05.841925 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (udp)
TestAuthMethodReadCommand - 2019/12/30 18:53:05.842074 [INFO] agent: Stopping HTTP server 127.0.0.1:49002 (tcp)
TestAuthMethodReadCommand - 2019/12/30 18:53:05.846212 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodReadCommand - 2019/12/30 18:53:05.846326 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodReadCommand (6.33s)
    --- PASS: TestAuthMethodReadCommand/name_required (0.01s)
    --- PASS: TestAuthMethodReadCommand/not_found (0.02s)
    --- PASS: TestAuthMethodReadCommand/read_by_name (0.62s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/read	6.624s
=== RUN   TestAuthMethodUpdateCommand_noTabs
=== PAUSE TestAuthMethodUpdateCommand_noTabs
=== RUN   TestAuthMethodUpdateCommand
=== PAUSE TestAuthMethodUpdateCommand
=== RUN   TestAuthMethodUpdateCommand_noMerge
=== PAUSE TestAuthMethodUpdateCommand_noMerge
=== RUN   TestAuthMethodUpdateCommand_k8s
=== PAUSE TestAuthMethodUpdateCommand_k8s
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge
=== PAUSE TestAuthMethodUpdateCommand_k8s_noMerge
=== CONT  TestAuthMethodUpdateCommand_noTabs
--- PASS: TestAuthMethodUpdateCommand_noTabs (0.01s)
=== CONT  TestAuthMethodUpdateCommand_k8s_noMerge
=== CONT  TestAuthMethodUpdateCommand_noMerge
=== CONT  TestAuthMethodUpdateCommand_k8s
=== CONT  TestAuthMethodUpdateCommand
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:18.185082 [WARN] agent: Node name "Node fa3a4e18-08eb-62c2-393a-08b584f0097a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:18.186248 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:18.296650 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:18.300408 [WARN] agent: Node name "Node e4811a21-26e7-bf4e-019a-c668de8a5372" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:18.348778 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:18.351225 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodUpdateCommand - 2019/12/30 18:53:18.376867 [WARN] agent: Node name "Node 15226548-6ba7-e101-7d6b-02a53fdb170c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand - 2019/12/30 18:53:18.377621 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:18.379091 [WARN] agent: Node name "Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand - 2019/12/30 18:53:18.379893 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:18.386226 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:18.388633 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fa3a4e18-08eb-62c2-393a-08b584f0097a Address:127.0.0.1:23512}]
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23512 [Follower] entering Follower state (Leader: "")
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.431740 [INFO] serf: EventMemberJoin: Node fa3a4e18-08eb-62c2-393a-08b584f0097a.dc1 127.0.0.1
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.440800 [INFO] serf: EventMemberJoin: Node fa3a4e18-08eb-62c2-393a-08b584f0097a 127.0.0.1
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.443778 [INFO] agent: Started DNS server 127.0.0.1:23507 (udp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.448473 [INFO] consul: Adding LAN server Node fa3a4e18-08eb-62c2-393a-08b584f0097a (Addr: tcp/127.0.0.1:23512) (DC: dc1)
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.455192 [INFO] agent: Started DNS server 127.0.0.1:23507 (tcp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.459556 [INFO] agent: Started HTTP server on 127.0.0.1:23508 (tcp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.459742 [INFO] agent: started state syncer
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:19.460247 [INFO] consul: Handled member-join event for server "Node fa3a4e18-08eb-62c2-393a-08b584f0097a.dc1" in area "wan"
2019/12/30 18:53:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23512 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e4811a21-26e7-bf4e-019a-c668de8a5372 Address:127.0.0.1:23506}]
2019/12/30 18:53:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:15226548-6ba7-e101-7d6b-02a53fdb170c Address:127.0.0.1:23524}]
2019/12/30 18:53:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7 Address:127.0.0.1:23518}]
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23518 [Follower] entering Follower state (Leader: "")
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.718801 [INFO] serf: EventMemberJoin: Node 15226548-6ba7-e101-7d6b-02a53fdb170c.dc1 127.0.0.1
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.720013 [INFO] serf: EventMemberJoin: Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7.dc1 127.0.0.1
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23524 [Follower] entering Follower state (Leader: "")
2019/12/30 18:53:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23524 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:19 [INFO]  raft: Node at 127.0.0.1:23518 [Candidate] entering Candidate state in term 2
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.791541 [INFO] serf: EventMemberJoin: Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7 127.0.0.1
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.791804 [INFO] serf: EventMemberJoin: Node 15226548-6ba7-e101-7d6b-02a53fdb170c 127.0.0.1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.793089 [INFO] serf: EventMemberJoin: Node e4811a21-26e7-bf4e-019a-c668de8a5372.dc1 127.0.0.1
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.793557 [INFO] agent: Started DNS server 127.0.0.1:23519 (udp)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.793595 [INFO] agent: Started DNS server 127.0.0.1:23513 (udp)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.794686 [INFO] consul: Adding LAN server Node 15226548-6ba7-e101-7d6b-02a53fdb170c (Addr: tcp/127.0.0.1:23524) (DC: dc1)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.794684 [INFO] consul: Adding LAN server Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7 (Addr: tcp/127.0.0.1:23518) (DC: dc1)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.794941 [INFO] consul: Handled member-join event for server "Node 15226548-6ba7-e101-7d6b-02a53fdb170c.dc1" in area "wan"
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.795454 [INFO] agent: Started DNS server 127.0.0.1:23519 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.795774 [INFO] consul: Handled member-join event for server "Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7.dc1" in area "wan"
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.796500 [INFO] agent: Started DNS server 127.0.0.1:23513 (tcp)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.797739 [INFO] agent: Started HTTP server on 127.0.0.1:23520 (tcp)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:19.797857 [INFO] agent: started state syncer
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.798845 [INFO] serf: EventMemberJoin: Node e4811a21-26e7-bf4e-019a-c668de8a5372 127.0.0.1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.839983 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.840668 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.842401 [INFO] consul: Adding LAN server Node e4811a21-26e7-bf4e-019a-c668de8a5372 (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.842870 [INFO] agent: Started HTTP server on 127.0.0.1:23514 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:19.842983 [INFO] agent: started state syncer
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.842861 [INFO] consul: Handled member-join event for server "Node e4811a21-26e7-bf4e-019a-c668de8a5372.dc1" in area "wan"
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.855613 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:19.855761 [INFO] agent: started state syncer
2019/12/30 18:53:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:20 [INFO]  raft: Node at 127.0.0.1:23512 [Leader] entering Leader state
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:20.248683 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:20.249443 [INFO] consul: New leader elected: Node fa3a4e18-08eb-62c2-393a-08b584f0097a
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:20.340880 [ERR] agent: failed to sync remote state: ACL not found
2019/12/30 18:53:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:20 [INFO]  raft: Node at 127.0.0.1:23518 [Leader] entering Leader state
2019/12/30 18:53:20 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:20.590110 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:20.590595 [INFO] consul: New leader elected: Node e4811a21-26e7-bf4e-019a-c668de8a5372
2019/12/30 18:53:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:20 [INFO]  raft: Node at 127.0.0.1:23524 [Leader] entering Leader state
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:20.590846 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:20.591185 [INFO] consul: New leader elected: Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7
TestAuthMethodUpdateCommand - 2019/12/30 18:53:20.591417 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand - 2019/12/30 18:53:20.591821 [INFO] consul: New leader elected: Node 15226548-6ba7-e101-7d6b-02a53fdb170c
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:20.594250 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:20.667203 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:20.680860 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:20.718046 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand - 2019/12/30 18:53:20.791763 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:20.866553 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:20.866645 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand - 2019/12/30 18:53:20.980879 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:20.980879 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:20.981809 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:20.982168 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:20.982236 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.002164 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.002308 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand - 2019/12/30 18:53:21.135980 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:21.156898 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:21.157011 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand - 2019/12/30 18:53:21.160305 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand - 2019/12/30 18:53:21.160387 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand - 2019/12/30 18:53:21.358785 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:21.358831 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand - 2019/12/30 18:53:21.359057 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:21.359219 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.396145 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.399931 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.477329 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:21.507021 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.582490 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.584467 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.584595 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.710907 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:21.890081 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.891245 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.891371 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.891953 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.892274 [INFO] serf: EventMemberUpdate: Node fa3a4e18-08eb-62c2-393a-08b584f0097a
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.892950 [INFO] serf: EventMemberUpdate: Node fa3a4e18-08eb-62c2-393a-08b584f0097a
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.892958 [INFO] serf: EventMemberUpdate: Node fa3a4e18-08eb-62c2-393a-08b584f0097a.dc1
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:21.893739 [INFO] serf: EventMemberUpdate: Node fa3a4e18-08eb-62c2-393a-08b584f0097a.dc1
TestAuthMethodUpdateCommand - 2019/12/30 18:53:21.894539 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.899267 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.899370 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand - 2019/12/30 18:53:21.899649 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.900523 [INFO] serf: EventMemberUpdate: Node e4811a21-26e7-bf4e-019a-c668de8a5372
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:21.902620 [INFO] serf: EventMemberUpdate: Node e4811a21-26e7-bf4e-019a-c668de8a5372.dc1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:22.123485 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:22.124497 [INFO] serf: EventMemberUpdate: Node e4811a21-26e7-bf4e-019a-c668de8a5372
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.124785 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:22.125164 [INFO] serf: EventMemberUpdate: Node e4811a21-26e7-bf4e-019a-c668de8a5372.dc1
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.125709 [INFO] serf: EventMemberUpdate: Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.126382 [INFO] serf: EventMemberUpdate: Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7.dc1
TestAuthMethodUpdateCommand - 2019/12/30 18:53:22.336406 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand - 2019/12/30 18:53:22.337358 [INFO] serf: EventMemberUpdate: Node 15226548-6ba7-e101-7d6b-02a53fdb170c
TestAuthMethodUpdateCommand - 2019/12/30 18:53:22.338056 [INFO] serf: EventMemberUpdate: Node 15226548-6ba7-e101-7d6b-02a53fdb170c.dc1
TestAuthMethodUpdateCommand - 2019/12/30 18:53:22.339627 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand - 2019/12/30 18:53:22.339703 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand - 2019/12/30 18:53:22.340716 [INFO] serf: EventMemberUpdate: Node 15226548-6ba7-e101-7d6b-02a53fdb170c
TestAuthMethodUpdateCommand - 2019/12/30 18:53:22.341366 [INFO] serf: EventMemberUpdate: Node 15226548-6ba7-e101-7d6b-02a53fdb170c.dc1
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.342562 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.342645 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.343437 [INFO] serf: EventMemberUpdate: Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.344150 [INFO] serf: EventMemberUpdate: Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7.dc1
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.995674 [INFO] agent: Synced node info
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:22.995794 [DEBUG] agent: Node info in sync
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:23.364865 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:23.365507 [DEBUG] consul: Skipping self join check for "Node fa3a4e18-08eb-62c2-393a-08b584f0097a" since the cluster is too small
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:23.365654 [INFO] consul: member 'Node fa3a4e18-08eb-62c2-393a-08b584f0097a' joined, marking health alive
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:23.675101 [DEBUG] consul: Skipping self join check for "Node fa3a4e18-08eb-62c2-393a-08b584f0097a" since the cluster is too small
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:23.680666 [DEBUG] http: Request PUT /v1/acl/auth-method (595.809781ms) from=127.0.0.1:59934
=== RUN   TestAuthMethodUpdateCommand_noMerge/update_without_name
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:23.682935 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:23.683487 [DEBUG] consul: Skipping self join check for "Node e4811a21-26e7-bf4e-019a-c668de8a5372" since the cluster is too small
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:23.683584 [INFO] consul: member 'Node e4811a21-26e7-bf4e-019a-c668de8a5372' joined, marking health alive
=== RUN   TestAuthMethodUpdateCommand_noMerge/update_nonexistent_method
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:23.707371 [DEBUG] consul: Skipping self join check for "Node fa3a4e18-08eb-62c2-393a-08b584f0097a" since the cluster is too small
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:23.731744 [DEBUG] http: Request GET /v1/acl/auth-method/test (18.531825ms) from=127.0.0.1:58076
=== RUN   TestAuthMethodUpdateCommand_noMerge/update_all_fields
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:23.760973 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-7f520a5c-228d-b0be-3742-bf55d2516f03 (7.844875ms) from=127.0.0.1:59940
TestAuthMethodUpdateCommand - 2019/12/30 18:53:23.869232 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodUpdateCommand - 2019/12/30 18:53:23.869788 [DEBUG] consul: Skipping self join check for "Node 15226548-6ba7-e101-7d6b-02a53fdb170c" since the cluster is too small
TestAuthMethodUpdateCommand - 2019/12/30 18:53:23.869890 [INFO] consul: member 'Node 15226548-6ba7-e101-7d6b-02a53fdb170c' joined, marking health alive
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.005959 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.007375 [DEBUG] http: Request PUT /v1/acl/auth-method (1.255865597s) from=127.0.0.1:58078
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.016305 [DEBUG] http: Request GET /v1/acl/auth-method/test-f78d5662-e049-6bdf-acb5-9d435b65a6ae (1.470372ms) from=127.0.0.1:58082
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:25.110882 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:25.114363 [DEBUG] consul: Skipping self join check for "Node e4811a21-26e7-bf4e-019a-c668de8a5372" since the cluster is too small
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:25.114922 [DEBUG] consul: Skipping self join check for "Node e4811a21-26e7-bf4e-019a-c668de8a5372" since the cluster is too small
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_host
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.375542 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.378513 [DEBUG] consul: Skipping self join check for "Node 15226548-6ba7-e101-7d6b-02a53fdb170c" since the cluster is too small
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.379010 [DEBUG] consul: Skipping self join check for "Node 15226548-6ba7-e101-7d6b-02a53fdb170c" since the cluster is too small
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.380924 [DEBUG] http: Request PUT /v1/acl/auth-method/test-f78d5662-e049-6bdf-acb5-9d435b65a6ae (362.069923ms) from=127.0.0.1:58082
=== RUN   TestAuthMethodUpdateCommand/update_without_name
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.384950 [DEBUG] http: Request GET /v1/acl/auth-method/test-f78d5662-e049-6bdf-acb5-9d435b65a6ae (828.688µs) from=127.0.0.1:58078
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.386698 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.386785 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.386848 [WARN] serf: Shutdown without a Leave
=== RUN   TestAuthMethodUpdateCommand/update_nonexistent_method
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.420241 [DEBUG] http: Request GET /v1/acl/auth-method/test (2.39273ms) from=127.0.0.1:50380
=== RUN   TestAuthMethodUpdateCommand/update_all_fields
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.530692 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:25.532024 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:25.532748 [DEBUG] http: Request PUT /v1/acl/auth-method (289.145325ms) from=127.0.0.1:37526
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:25.541826 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-7f520a5c-228d-b0be-3742-bf55d2516f03 (1.767027801s) from=127.0.0.1:59940
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:25.548266 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-7f520a5c-228d-b0be-3742-bf55d2516f03 (1.345369ms) from=127.0.0.1:59934
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:25.552271 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-995db682-860e-dbd0-2f74-1fc7e4297327 (4.177777ms) from=127.0.0.1:37532
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_ca_cert
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_with_cert_file
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.664147 [INFO] manager: shutting down
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.665154 [INFO] agent: consul server down
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.665245 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.665379 [INFO] agent: Stopping DNS server 127.0.0.1:23507 (tcp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.665627 [INFO] agent: Stopping DNS server 127.0.0.1:23507 (udp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.665853 [INFO] agent: Stopping HTTP server 127.0.0.1:23508 (tcp)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.665639 [DEBUG] http: Request PUT /v1/acl/auth-method (242.173747ms) from=127.0.0.1:50382
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.667015 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand_noMerge - 2019/12/30 18:53:25.667072 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand_noMerge (7.77s)
    --- PASS: TestAuthMethodUpdateCommand_noMerge/update_without_name (0.01s)
    --- PASS: TestAuthMethodUpdateCommand_noMerge/update_nonexistent_method (0.05s)
    --- PASS: TestAuthMethodUpdateCommand_noMerge/update_all_fields (1.64s)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.675028 [DEBUG] http: Request GET /v1/acl/auth-method/test-fc76796a-6c61-208f-9143-1152603cb03e (1.370703ms) from=127.0.0.1:50386
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:25.923085 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:25.924379 [DEBUG] http: Request PUT /v1/acl/auth-method (364.266648ms) from=127.0.0.1:37526
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.924739 [DEBUG] http: Request PUT /v1/acl/auth-method/test-fc76796a-6c61-208f-9143-1152603cb03e (246.267189ms) from=127.0.0.1:50386
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.930234 [DEBUG] http: Request GET /v1/acl/auth-method/test-fc76796a-6c61-208f-9143-1152603cb03e (1.466039ms) from=127.0.0.1:50382
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.932240 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.932486 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand - 2019/12/30 18:53:25.932630 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:25.942226 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-59c33bf3-8a72-794e-1bf8-715181c97a50 (1.462705ms) from=127.0.0.1:37536
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_jwt
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.113986 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.114680 [DEBUG] consul: Skipping self join check for "Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7" since the cluster is too small
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.114879 [INFO] consul: member 'Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7' joined, marking health alive
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.115992 [DEBUG] http: Request PUT /v1/acl/auth-method (520.054774ms) from=127.0.0.1:59934
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.132164 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-27d50492-93b1-4c69-685f-10e55ee710d9 (1.348036ms) from=127.0.0.1:59956
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.239196 [INFO] manager: shutting down
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.240021 [INFO] agent: consul server down
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.240092 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.240153 [INFO] agent: Stopping DNS server 127.0.0.1:23519 (tcp)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.240298 [INFO] agent: Stopping DNS server 127.0.0.1:23519 (udp)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.240488 [INFO] agent: Stopping HTTP server 127.0.0.1:23520 (tcp)
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.241324 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand - 2019/12/30 18:53:26.241409 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand (8.35s)
    --- PASS: TestAuthMethodUpdateCommand/update_without_name (0.02s)
    --- PASS: TestAuthMethodUpdateCommand/update_nonexistent_method (0.02s)
    --- PASS: TestAuthMethodUpdateCommand/update_all_fields (0.51s)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:26.424776 [DEBUG] http: Request PUT /v1/acl/auth-method (476.687291ms) from=127.0.0.1:37526
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:26.447478 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-851b2bc9-c8e3-76cd-4b8e-7a35f53e032c (1.86805ms) from=127.0.0.1:37540
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:26.692589 [DEBUG] http: Request PUT /v1/acl/auth-method (238.292978ms) from=127.0.0.1:37526
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:26.702025 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-c635dd5b-9f79-32c1-679d-0efaa099c93a (1.300035ms) from=127.0.0.1:37542
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.831808 [DEBUG] consul: Skipping self join check for "Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7" since the cluster is too small
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.833411 [DEBUG] consul: Skipping self join check for "Node 5d584cc8-7c9f-21fe-6570-3c1cdba8b8e7" since the cluster is too small
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.834108 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-27d50492-93b1-4c69-685f-10e55ee710d9 (694.085382ms) from=127.0.0.1:59956
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:26.840427 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-27d50492-93b1-4c69-685f-10e55ee710d9 (1.420371ms) from=127.0.0.1:59934
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_host
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:26.940770 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-c635dd5b-9f79-32c1-679d-0efaa099c93a (234.098866ms) from=127.0.0.1:37542
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:26.946795 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-c635dd5b-9f79-32c1-679d-0efaa099c93a (996.693µs) from=127.0.0.1:37526
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields_with_cert_file
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:27.074943 [DEBUG] http: Request PUT /v1/acl/auth-method (227.150016ms) from=127.0.0.1:59934
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:27.086766 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-fe480f4b-ed59-c799-5e12-1cc488d719cd (1.249367ms) from=127.0.0.1:59962
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.165941 [DEBUG] http: Request PUT /v1/acl/auth-method (213.498654ms) from=127.0.0.1:37526
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.181521 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-2ece305b-e7d8-76ff-b433-53af3df1a6ac (1.411371ms) from=127.0.0.1:37546
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:27.467792 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-fe480f4b-ed59-c799-5e12-1cc488d719cd (376.608974ms) from=127.0.0.1:59962
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:27.473752 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-fe480f4b-ed59-c799-5e12-1cc488d719cd (1.2527ms) from=127.0.0.1:59934
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_ca_cert
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.699079 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-2ece305b-e7d8-76ff-b433-53af3df1a6ac (512.606576ms) from=127.0.0.1:37546
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.705479 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-2ece305b-e7d8-76ff-b433-53af3df1a6ac (1.208699ms) from=127.0.0.1:37526
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.708842 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.708950 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.709006 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:27.782306 [DEBUG] http: Request PUT /v1/acl/auth-method (294.546135ms) from=127.0.0.1:59934
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:27.791375 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-aa3c4f4f-c83c-92cd-3c96-fe7b6291c744 (1.266033ms) from=127.0.0.1:59966
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.889121 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.964834 [INFO] manager: shutting down
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.965566 [INFO] agent: consul server down
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.965677 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.965748 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.965908 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.966059 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.967168 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/30 18:53:27.967383 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand_k8s_noMerge (10.08s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_host (0.32s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_ca_cert (0.39s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_jwt (0.50s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields (0.50s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields_with_cert_file (0.76s)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.048851 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-aa3c4f4f-c83c-92cd-3c96-fe7b6291c744 (253.347043ms) from=127.0.0.1:59966
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.054299 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-aa3c4f4f-c83c-92cd-3c96-fe7b6291c744 (1.031694ms) from=127.0.0.1:59934
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_jwt
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.223955 [DEBUG] http: Request PUT /v1/acl/auth-method (164.001677ms) from=127.0.0.1:59934
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.235500 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-5d60a19f-0d5b-4081-2579-6c70667d18fa (1.520707ms) from=127.0.0.1:59968
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.424779 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-5d60a19f-0d5b-4081-2579-6c70667d18fa (184.297548ms) from=127.0.0.1:59968
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.431889 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-5d60a19f-0d5b-4081-2579-6c70667d18fa (2.277394ms) from=127.0.0.1:59934
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.435864 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.435980 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.436033 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.489183 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.539210 [INFO] manager: shutting down
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.540041 [INFO] agent: consul server down
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.540106 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.540168 [INFO] agent: Stopping DNS server 127.0.0.1:23513 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.540308 [INFO] agent: Stopping DNS server 127.0.0.1:23513 (udp)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.540475 [INFO] agent: Stopping HTTP server 127.0.0.1:23514 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.541676 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand_k8s - 2019/12/30 18:53:28.541881 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand_k8s (10.65s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields (2.53s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_with_cert_file (1.25s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_host (0.64s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_ca_cert (0.57s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_jwt (0.38s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/update	11.297s
?   	github.com/hashicorp/consul/command/acl/bindingrule	[no test files]
=== RUN   TestBindingRuleCreateCommand_noTabs
=== PAUSE TestBindingRuleCreateCommand_noTabs
=== RUN   TestBindingRuleCreateCommand
=== PAUSE TestBindingRuleCreateCommand
=== CONT  TestBindingRuleCreateCommand_noTabs
=== CONT  TestBindingRuleCreateCommand
--- PASS: TestBindingRuleCreateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleCreateCommand - 2019/12/30 18:53:48.173429 [WARN] agent: Node name "Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleCreateCommand - 2019/12/30 18:53:48.174596 [DEBUG] tlsutil: Update with version 1
TestBindingRuleCreateCommand - 2019/12/30 18:53:48.190801 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:53:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9 Address:127.0.0.1:20506}]
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.654896 [INFO] serf: EventMemberJoin: Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9.dc1 127.0.0.1
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.658331 [INFO] serf: EventMemberJoin: Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9 127.0.0.1
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.661255 [INFO] consul: Adding LAN server Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9 (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.661527 [INFO] consul: Handled member-join event for server "Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9.dc1" in area "wan"
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.664904 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.664993 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.668596 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestBindingRuleCreateCommand - 2019/12/30 18:53:49.668910 [INFO] agent: started state syncer
2019/12/30 18:53:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:53:49 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/30 18:53:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:53:50 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestBindingRuleCreateCommand - 2019/12/30 18:53:50.482906 [INFO] consul: cluster leadership acquired
TestBindingRuleCreateCommand - 2019/12/30 18:53:50.483547 [INFO] consul: New leader elected: Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9
TestBindingRuleCreateCommand - 2019/12/30 18:53:50.592733 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleCreateCommand - 2019/12/30 18:53:50.688953 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleCreateCommand - 2019/12/30 18:53:51.212472 [INFO] acl: initializing acls
TestBindingRuleCreateCommand - 2019/12/30 18:53:51.526712 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleCreateCommand - 2019/12/30 18:53:51.526790 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleCreateCommand - 2019/12/30 18:53:51.527332 [INFO] acl: initializing acls
TestBindingRuleCreateCommand - 2019/12/30 18:53:51.527438 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.083038 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.083133 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.250319 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.250473 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.251661 [INFO] serf: EventMemberUpdate: Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.252346 [INFO] serf: EventMemberUpdate: Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9.dc1
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.507558 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.508598 [INFO] serf: EventMemberUpdate: Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9
TestBindingRuleCreateCommand - 2019/12/30 18:53:52.509515 [INFO] serf: EventMemberUpdate: Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9.dc1
TestBindingRuleCreateCommand - 2019/12/30 18:53:54.891941 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.135273 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.135723 [DEBUG] consul: Skipping self join check for "Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9" since the cluster is too small
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.135827 [INFO] consul: member 'Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9' joined, marking health alive
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.350998 [DEBUG] consul: Skipping self join check for "Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9" since the cluster is too small
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.351634 [DEBUG] consul: Skipping self join check for "Node 526e5bfd-e8d4-98c5-23d6-5ab143e2f0f9" since the cluster is too small
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.628740 [DEBUG] http: Request PUT /v1/acl/auth-method (254.707408ms) from=127.0.0.1:32930
=== RUN   TestBindingRuleCreateCommand/method_is_required
=== RUN   TestBindingRuleCreateCommand/bind_type_required
=== RUN   TestBindingRuleCreateCommand/bind_name_required
=== RUN   TestBindingRuleCreateCommand/must_use_roughly_valid_selector
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.656208 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.659356 [ERR] http: Request PUT /v1/acl/binding-rule, error: invalid Binding Rule: Selector is invalid: 1:4 (3): no match found, expected: "!=", ".", "==", "[", [ \t\r\n] or [a-zA-Z0-9_] from=127.0.0.1:32932
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.660642 [DEBUG] http: Request PUT /v1/acl/binding-rule (5.788486ms) from=127.0.0.1:32932
=== RUN   TestBindingRuleCreateCommand/create_it_with_no_selector
TestBindingRuleCreateCommand - 2019/12/30 18:53:55.908217 [DEBUG] http: Request PUT /v1/acl/binding-rule (233.695852ms) from=127.0.0.1:32934
=== RUN   TestBindingRuleCreateCommand/create_it_with_a_match_selector
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.178125 [DEBUG] http: Request PUT /v1/acl/binding-rule (251.827998ms) from=127.0.0.1:32936
=== RUN   TestBindingRuleCreateCommand/create_it_with_type_role
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.523619 [DEBUG] http: Request PUT /v1/acl/binding-rule (337.01692ms) from=127.0.0.1:32938
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.534856 [INFO] agent: Requesting shutdown
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.534987 [INFO] consul: shutting down server
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.535039 [WARN] serf: Shutdown without a Leave
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.681392 [WARN] serf: Shutdown without a Leave
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.791393 [INFO] manager: shutting down
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.792165 [INFO] agent: consul server down
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.792231 [INFO] agent: shutdown complete
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.792304 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.792467 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.792647 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.793954 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleCreateCommand - 2019/12/30 18:53:56.794145 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleCreateCommand (8.72s)
    --- PASS: TestBindingRuleCreateCommand/method_is_required (0.00s)
    --- PASS: TestBindingRuleCreateCommand/bind_type_required (0.00s)
    --- PASS: TestBindingRuleCreateCommand/bind_name_required (0.01s)
    --- PASS: TestBindingRuleCreateCommand/must_use_roughly_valid_selector (0.02s)
    --- PASS: TestBindingRuleCreateCommand/create_it_with_no_selector (0.25s)
    --- PASS: TestBindingRuleCreateCommand/create_it_with_a_match_selector (0.26s)
    --- PASS: TestBindingRuleCreateCommand/create_it_with_type_role (0.35s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/create	9.025s
=== RUN   TestBindingRuleDeleteCommand_noTabs
=== PAUSE TestBindingRuleDeleteCommand_noTabs
=== RUN   TestBindingRuleDeleteCommand
=== PAUSE TestBindingRuleDeleteCommand
=== CONT  TestBindingRuleDeleteCommand_noTabs
=== CONT  TestBindingRuleDeleteCommand
--- PASS: TestBindingRuleDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleDeleteCommand - 2019/12/30 18:54:01.856478 [WARN] agent: Node name "Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleDeleteCommand - 2019/12/30 18:54:01.857437 [DEBUG] tlsutil: Update with version 1
TestBindingRuleDeleteCommand - 2019/12/30 18:54:01.881236 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:57ee7957-75f5-cdad-adaf-aaf24ba4743b Address:127.0.0.1:40006}]
2019/12/30 18:54:04 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.048676 [INFO] serf: EventMemberJoin: Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b.dc1 127.0.0.1
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.058717 [INFO] serf: EventMemberJoin: Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b 127.0.0.1
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.061133 [INFO] consul: Adding LAN server Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.061581 [INFO] consul: Handled member-join event for server "Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b.dc1" in area "wan"
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.063185 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.064782 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.069085 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestBindingRuleDeleteCommand - 2019/12/30 18:54:04.069240 [INFO] agent: started state syncer
2019/12/30 18:54:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:04 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:05 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestBindingRuleDeleteCommand - 2019/12/30 18:54:05.100620 [INFO] consul: cluster leadership acquired
TestBindingRuleDeleteCommand - 2019/12/30 18:54:05.101228 [INFO] consul: New leader elected: Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b
TestBindingRuleDeleteCommand - 2019/12/30 18:54:05.179600 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleDeleteCommand - 2019/12/30 18:54:05.548872 [INFO] acl: initializing acls
TestBindingRuleDeleteCommand - 2019/12/30 18:54:05.612517 [INFO] acl: initializing acls
TestBindingRuleDeleteCommand - 2019/12/30 18:54:05.920378 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleDeleteCommand - 2019/12/30 18:54:07.424826 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleDeleteCommand - 2019/12/30 18:54:07.424916 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleDeleteCommand - 2019/12/30 18:54:07.425895 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleDeleteCommand - 2019/12/30 18:54:07.425964 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleDeleteCommand - 2019/12/30 18:54:07.701458 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleDeleteCommand - 2019/12/30 18:54:08.074967 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleDeleteCommand - 2019/12/30 18:54:08.075083 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleDeleteCommand - 2019/12/30 18:54:08.076044 [INFO] serf: EventMemberUpdate: Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b
TestBindingRuleDeleteCommand - 2019/12/30 18:54:08.077017 [INFO] serf: EventMemberUpdate: Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b.dc1
TestBindingRuleDeleteCommand - 2019/12/30 18:54:08.077194 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleDeleteCommand - 2019/12/30 18:54:08.078313 [INFO] serf: EventMemberUpdate: Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b
TestBindingRuleDeleteCommand - 2019/12/30 18:54:08.079752 [INFO] serf: EventMemberUpdate: Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b.dc1
TestBindingRuleDeleteCommand - 2019/12/30 18:54:09.432770 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleDeleteCommand - 2019/12/30 18:54:09.433416 [DEBUG] consul: Skipping self join check for "Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b" since the cluster is too small
TestBindingRuleDeleteCommand - 2019/12/30 18:54:09.433539 [INFO] consul: member 'Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b' joined, marking health alive
TestBindingRuleDeleteCommand - 2019/12/30 18:54:09.687043 [DEBUG] consul: Skipping self join check for "Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b" since the cluster is too small
TestBindingRuleDeleteCommand - 2019/12/30 18:54:09.687521 [DEBUG] consul: Skipping self join check for "Node 57ee7957-75f5-cdad-adaf-aaf24ba4743b" since the cluster is too small
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.026784 [DEBUG] http: Request PUT /v1/acl/auth-method (329.237045ms) from=127.0.0.1:46824
=== RUN   TestBindingRuleDeleteCommand/id_required
=== RUN   TestBindingRuleDeleteCommand/delete_works
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.035385 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.301010 [DEBUG] http: Request PUT /v1/acl/binding-rule (266.965064ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.582461 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.583828 [DEBUG] http: Request DELETE /v1/acl/binding-rule/681c5664-6e53-bf97-ddc7-3e10c0fec0d3 (261.562921ms) from=127.0.0.1:46826
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.587034 [DEBUG] http: Request GET /v1/acl/binding-rule/681c5664-6e53-bf97-ddc7-3e10c0fec0d3 (559.348µs) from=127.0.0.1:46824
=== RUN   TestBindingRuleDeleteCommand/delete_works_via_prefixes
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.759776 [DEBUG] http: Request PUT /v1/acl/binding-rule (169.918162ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.770277 [DEBUG] http: Request GET /v1/acl/binding-rules (1.76538ms) from=127.0.0.1:46828
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.976922 [DEBUG] http: Request DELETE /v1/acl/binding-rule/ea522544-9111-43ba-6019-6bd490ab730f (203.114041ms) from=127.0.0.1:46828
TestBindingRuleDeleteCommand - 2019/12/30 18:54:10.979966 [DEBUG] http: Request GET /v1/acl/binding-rule/ea522544-9111-43ba-6019-6bd490ab730f (557.681µs) from=127.0.0.1:46824
=== RUN   TestBindingRuleDeleteCommand/delete_fails_when_prefix_matches_more_than_one_rule
TestBindingRuleDeleteCommand - 2019/12/30 18:54:11.001901 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (18.322818ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:11.319895 [DEBUG] http: Request PUT /v1/acl/binding-rule (312.918946ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:11.323708 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.005693ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:11.501990 [DEBUG] http: Request PUT /v1/acl/binding-rule (175.682981ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:11.510074 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (2.205725ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:11.808528 [DEBUG] http: Request PUT /v1/acl/binding-rule (295.813827ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:11.814208 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.017694ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.037959 [DEBUG] http: Request PUT /v1/acl/binding-rule (218.57845ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.042664 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.730046ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.260785 [DEBUG] http: Request PUT /v1/acl/binding-rule (214.377339ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.265843 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.169364ms) from=127.0.0.1:46824
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.273770 [DEBUG] http: Request GET /v1/acl/binding-rules (1.156031ms) from=127.0.0.1:46830
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.276353 [INFO] agent: Requesting shutdown
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.276439 [INFO] consul: shutting down server
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.276490 [WARN] serf: Shutdown without a Leave
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.367116 [WARN] serf: Shutdown without a Leave
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.469934 [INFO] manager: shutting down
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.470775 [INFO] agent: consul server down
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.470848 [INFO] agent: shutdown complete
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.470909 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.471079 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.471241 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.471900 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleDeleteCommand - 2019/12/30 18:54:12.471988 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleDeleteCommand (10.71s)
    --- PASS: TestBindingRuleDeleteCommand/id_required (0.00s)
    --- PASS: TestBindingRuleDeleteCommand/delete_works (0.56s)
    --- PASS: TestBindingRuleDeleteCommand/delete_works_via_prefixes (0.39s)
    --- PASS: TestBindingRuleDeleteCommand/delete_fails_when_prefix_matches_more_than_one_rule (1.30s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/delete	11.017s
=== RUN   TestBindingRuleListCommand_noTabs
=== PAUSE TestBindingRuleListCommand_noTabs
=== RUN   TestBindingRuleListCommand
=== PAUSE TestBindingRuleListCommand
=== CONT  TestBindingRuleListCommand_noTabs
=== CONT  TestBindingRuleListCommand
--- PASS: TestBindingRuleListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleListCommand - 2019/12/30 18:54:15.164145 [WARN] agent: Node name "Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleListCommand - 2019/12/30 18:54:15.165218 [DEBUG] tlsutil: Update with version 1
TestBindingRuleListCommand - 2019/12/30 18:54:15.172263 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e63c1a1a-c1e2-f2e4-f998-2a694cbc4551 Address:127.0.0.1:28006}]
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:28006 [Follower] entering Follower state (Leader: "")
TestBindingRuleListCommand - 2019/12/30 18:54:16.213249 [INFO] serf: EventMemberJoin: Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551.dc1 127.0.0.1
TestBindingRuleListCommand - 2019/12/30 18:54:16.217524 [INFO] serf: EventMemberJoin: Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551 127.0.0.1
TestBindingRuleListCommand - 2019/12/30 18:54:16.219474 [INFO] consul: Adding LAN server Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551 (Addr: tcp/127.0.0.1:28006) (DC: dc1)
TestBindingRuleListCommand - 2019/12/30 18:54:16.219805 [INFO] agent: Started DNS server 127.0.0.1:28001 (udp)
TestBindingRuleListCommand - 2019/12/30 18:54:16.220169 [INFO] consul: Handled member-join event for server "Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551.dc1" in area "wan"
TestBindingRuleListCommand - 2019/12/30 18:54:16.220504 [INFO] agent: Started DNS server 127.0.0.1:28001 (tcp)
TestBindingRuleListCommand - 2019/12/30 18:54:16.223094 [INFO] agent: Started HTTP server on 127.0.0.1:28002 (tcp)
TestBindingRuleListCommand - 2019/12/30 18:54:16.223251 [INFO] agent: started state syncer
2019/12/30 18:54:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:28006 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:16 [INFO]  raft: Node at 127.0.0.1:28006 [Leader] entering Leader state
TestBindingRuleListCommand - 2019/12/30 18:54:16.815650 [INFO] consul: cluster leadership acquired
TestBindingRuleListCommand - 2019/12/30 18:54:16.816305 [INFO] consul: New leader elected: Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551
TestBindingRuleListCommand - 2019/12/30 18:54:16.971927 [INFO] acl: initializing acls
TestBindingRuleListCommand - 2019/12/30 18:54:17.090325 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleListCommand - 2019/12/30 18:54:17.265609 [INFO] acl: initializing acls
TestBindingRuleListCommand - 2019/12/30 18:54:17.266279 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleListCommand - 2019/12/30 18:54:17.266353 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleListCommand - 2019/12/30 18:54:17.408869 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleListCommand - 2019/12/30 18:54:17.408951 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleListCommand - 2019/12/30 18:54:17.966871 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleListCommand - 2019/12/30 18:54:17.966888 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleListCommand - 2019/12/30 18:54:18.110296 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleListCommand - 2019/12/30 18:54:18.111271 [INFO] serf: EventMemberUpdate: Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551
TestBindingRuleListCommand - 2019/12/30 18:54:18.111846 [INFO] serf: EventMemberUpdate: Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551.dc1
TestBindingRuleListCommand - 2019/12/30 18:54:18.260381 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleListCommand - 2019/12/30 18:54:18.260470 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleListCommand - 2019/12/30 18:54:18.261398 [INFO] serf: EventMemberUpdate: Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551
TestBindingRuleListCommand - 2019/12/30 18:54:18.262267 [INFO] serf: EventMemberUpdate: Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551.dc1
TestBindingRuleListCommand - 2019/12/30 18:54:18.416434 [INFO] agent: Synced node info
TestBindingRuleListCommand - 2019/12/30 18:54:18.416559 [DEBUG] agent: Node info in sync
TestBindingRuleListCommand - 2019/12/30 18:54:18.760426 [DEBUG] http: Request PUT /v1/acl/auth-method (329.689055ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:19.178937 [DEBUG] http: Request PUT /v1/acl/auth-method (413.603941ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:19.183656 [DEBUG] acl: updating cached auth method validator for "test-1"
TestBindingRuleListCommand - 2019/12/30 18:54:19.352979 [DEBUG] http: Request PUT /v1/acl/binding-rule (170.355506ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:19.360679 [DEBUG] acl: updating cached auth method validator for "test-2"
TestBindingRuleListCommand - 2019/12/30 18:54:20.058276 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleListCommand - 2019/12/30 18:54:20.058863 [DEBUG] consul: Skipping self join check for "Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551" since the cluster is too small
TestBindingRuleListCommand - 2019/12/30 18:54:20.059055 [INFO] consul: member 'Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551' joined, marking health alive
TestBindingRuleListCommand - 2019/12/30 18:54:20.059074 [DEBUG] http: Request PUT /v1/acl/binding-rule (700.962876ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:20.529898 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleListCommand - 2019/12/30 18:54:20.531196 [DEBUG] consul: Skipping self join check for "Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551" since the cluster is too small
TestBindingRuleListCommand - 2019/12/30 18:54:20.531841 [DEBUG] consul: Skipping self join check for "Node e63c1a1a-c1e2-f2e4-f998-2a694cbc4551" since the cluster is too small
TestBindingRuleListCommand - 2019/12/30 18:54:20.542407 [DEBUG] http: Request PUT /v1/acl/binding-rule (478.578326ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:20.893542 [DEBUG] http: Request PUT /v1/acl/binding-rule (345.725145ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:21.083860 [DEBUG] http: Request PUT /v1/acl/binding-rule (186.834276ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:21.259238 [DEBUG] http: Request PUT /v1/acl/binding-rule (170.859186ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:21.560253 [DEBUG] http: Request PUT /v1/acl/binding-rule (289.69433ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:21.784134 [DEBUG] http: Request PUT /v1/acl/binding-rule (216.955072ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:21.958999 [DEBUG] http: Request PUT /v1/acl/binding-rule (171.612539ms) from=127.0.0.1:44504
TestBindingRuleListCommand - 2019/12/30 18:54:22.169866 [DEBUG] http: Request PUT /v1/acl/binding-rule (206.795803ms) from=127.0.0.1:44504
=== RUN   TestBindingRuleListCommand/normal
TestBindingRuleListCommand - 2019/12/30 18:54:22.177864 [DEBUG] http: Request GET /v1/acl/binding-rules (1.691044ms) from=127.0.0.1:44506
=== RUN   TestBindingRuleListCommand/filter_by_method_1
TestBindingRuleListCommand - 2019/12/30 18:54:22.191123 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test-1 (1.88105ms) from=127.0.0.1:44508
=== RUN   TestBindingRuleListCommand/filter_by_method_2
TestBindingRuleListCommand - 2019/12/30 18:54:22.204622 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test-2 (1.815715ms) from=127.0.0.1:44510
TestBindingRuleListCommand - 2019/12/30 18:54:22.208918 [INFO] agent: Requesting shutdown
TestBindingRuleListCommand - 2019/12/30 18:54:22.209047 [INFO] consul: shutting down server
TestBindingRuleListCommand - 2019/12/30 18:54:22.209103 [WARN] serf: Shutdown without a Leave
TestBindingRuleListCommand - 2019/12/30 18:54:22.358720 [WARN] serf: Shutdown without a Leave
TestBindingRuleListCommand - 2019/12/30 18:54:22.540692 [INFO] manager: shutting down
TestBindingRuleListCommand - 2019/12/30 18:54:22.541591 [INFO] agent: consul server down
TestBindingRuleListCommand - 2019/12/30 18:54:22.541655 [INFO] agent: shutdown complete
TestBindingRuleListCommand - 2019/12/30 18:54:22.541716 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (tcp)
TestBindingRuleListCommand - 2019/12/30 18:54:22.541913 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (udp)
TestBindingRuleListCommand - 2019/12/30 18:54:22.542109 [INFO] agent: Stopping HTTP server 127.0.0.1:28002 (tcp)
TestBindingRuleListCommand - 2019/12/30 18:54:22.543231 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleListCommand - 2019/12/30 18:54:22.543375 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleListCommand (7.48s)
    --- PASS: TestBindingRuleListCommand/normal (0.01s)
    --- PASS: TestBindingRuleListCommand/filter_by_method_1 (0.01s)
    --- PASS: TestBindingRuleListCommand/filter_by_method_2 (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/list	7.812s
=== RUN   TestBindingRuleReadCommand_noTabs
=== PAUSE TestBindingRuleReadCommand_noTabs
=== RUN   TestBindingRuleReadCommand
=== PAUSE TestBindingRuleReadCommand
=== CONT  TestBindingRuleReadCommand_noTabs
=== CONT  TestBindingRuleReadCommand
--- PASS: TestBindingRuleReadCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleReadCommand - 2019/12/30 18:54:41.437996 [WARN] agent: Node name "Node 9458908c-f92e-38f5-7087-e4ce49ff077f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleReadCommand - 2019/12/30 18:54:41.438926 [DEBUG] tlsutil: Update with version 1
TestBindingRuleReadCommand - 2019/12/30 18:54:41.477882 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9458908c-f92e-38f5-7087-e4ce49ff077f Address:127.0.0.1:49006}]
2019/12/30 18:54:42 [INFO]  raft: Node at 127.0.0.1:49006 [Follower] entering Follower state (Leader: "")
TestBindingRuleReadCommand - 2019/12/30 18:54:42.763457 [INFO] serf: EventMemberJoin: Node 9458908c-f92e-38f5-7087-e4ce49ff077f.dc1 127.0.0.1
TestBindingRuleReadCommand - 2019/12/30 18:54:42.775599 [INFO] serf: EventMemberJoin: Node 9458908c-f92e-38f5-7087-e4ce49ff077f 127.0.0.1
TestBindingRuleReadCommand - 2019/12/30 18:54:42.778652 [INFO] consul: Adding LAN server Node 9458908c-f92e-38f5-7087-e4ce49ff077f (Addr: tcp/127.0.0.1:49006) (DC: dc1)
TestBindingRuleReadCommand - 2019/12/30 18:54:42.785519 [INFO] consul: Handled member-join event for server "Node 9458908c-f92e-38f5-7087-e4ce49ff077f.dc1" in area "wan"
TestBindingRuleReadCommand - 2019/12/30 18:54:42.790540 [INFO] agent: Started DNS server 127.0.0.1:49001 (tcp)
TestBindingRuleReadCommand - 2019/12/30 18:54:42.790662 [INFO] agent: Started DNS server 127.0.0.1:49001 (udp)
TestBindingRuleReadCommand - 2019/12/30 18:54:42.797446 [INFO] agent: Started HTTP server on 127.0.0.1:49002 (tcp)
2019/12/30 18:54:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:42 [INFO]  raft: Node at 127.0.0.1:49006 [Candidate] entering Candidate state in term 2
TestBindingRuleReadCommand - 2019/12/30 18:54:42.800107 [INFO] agent: started state syncer
2019/12/30 18:54:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:54:43 [INFO]  raft: Node at 127.0.0.1:49006 [Leader] entering Leader state
TestBindingRuleReadCommand - 2019/12/30 18:54:43.539468 [INFO] consul: cluster leadership acquired
TestBindingRuleReadCommand - 2019/12/30 18:54:43.540127 [INFO] consul: New leader elected: Node 9458908c-f92e-38f5-7087-e4ce49ff077f
TestBindingRuleReadCommand - 2019/12/30 18:54:43.542918 [INFO] acl: initializing acls
TestBindingRuleReadCommand - 2019/12/30 18:54:43.770290 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleReadCommand - 2019/12/30 18:54:43.866753 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleReadCommand - 2019/12/30 18:54:43.866851 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleReadCommand - 2019/12/30 18:54:43.869672 [INFO] acl: initializing acls
TestBindingRuleReadCommand - 2019/12/30 18:54:43.869842 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleReadCommand - 2019/12/30 18:54:44.026806 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleReadCommand - 2019/12/30 18:54:44.301483 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleReadCommand - 2019/12/30 18:54:44.668336 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleReadCommand - 2019/12/30 18:54:44.669319 [INFO] serf: EventMemberUpdate: Node 9458908c-f92e-38f5-7087-e4ce49ff077f
TestBindingRuleReadCommand - 2019/12/30 18:54:44.671963 [INFO] serf: EventMemberUpdate: Node 9458908c-f92e-38f5-7087-e4ce49ff077f.dc1
TestBindingRuleReadCommand - 2019/12/30 18:54:44.669342 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleReadCommand - 2019/12/30 18:54:44.674312 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleReadCommand - 2019/12/30 18:54:44.675145 [INFO] serf: EventMemberUpdate: Node 9458908c-f92e-38f5-7087-e4ce49ff077f
TestBindingRuleReadCommand - 2019/12/30 18:54:44.675770 [INFO] serf: EventMemberUpdate: Node 9458908c-f92e-38f5-7087-e4ce49ff077f.dc1
TestBindingRuleReadCommand - 2019/12/30 18:54:45.921901 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleReadCommand - 2019/12/30 18:54:45.923882 [DEBUG] consul: Skipping self join check for "Node 9458908c-f92e-38f5-7087-e4ce49ff077f" since the cluster is too small
TestBindingRuleReadCommand - 2019/12/30 18:54:45.924958 [INFO] consul: member 'Node 9458908c-f92e-38f5-7087-e4ce49ff077f' joined, marking health alive
TestBindingRuleReadCommand - 2019/12/30 18:54:46.270651 [DEBUG] consul: Skipping self join check for "Node 9458908c-f92e-38f5-7087-e4ce49ff077f" since the cluster is too small
TestBindingRuleReadCommand - 2019/12/30 18:54:46.271258 [DEBUG] consul: Skipping self join check for "Node 9458908c-f92e-38f5-7087-e4ce49ff077f" since the cluster is too small
TestBindingRuleReadCommand - 2019/12/30 18:54:46.443701 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestBindingRuleReadCommand - 2019/12/30 18:54:47.276790 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleReadCommand - 2019/12/30 18:54:47.278825 [INFO] agent: Synced node info
TestBindingRuleReadCommand - 2019/12/30 18:54:47.278941 [DEBUG] agent: Node info in sync
TestBindingRuleReadCommand - 2019/12/30 18:54:47.280899 [DEBUG] http: Request PUT /v1/acl/auth-method (989.620829ms) from=127.0.0.1:48730
=== RUN   TestBindingRuleReadCommand/id_required
=== RUN   TestBindingRuleReadCommand/read_by_id_not_found
TestBindingRuleReadCommand - 2019/12/30 18:54:47.291787 [DEBUG] http: Request GET /v1/acl/binding-rule/89e39f88-08ae-488e-82f7-dddd31b804ea (770.687µs) from=127.0.0.1:48732
=== RUN   TestBindingRuleReadCommand/read_by_id
TestBindingRuleReadCommand - 2019/12/30 18:54:47.295567 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleReadCommand - 2019/12/30 18:54:47.713040 [DEBUG] http: Request PUT /v1/acl/binding-rule (418.588733ms) from=127.0.0.1:48730
TestBindingRuleReadCommand - 2019/12/30 18:54:47.727603 [DEBUG] http: Request GET /v1/acl/binding-rule/9df9f5ef-b5ce-82c2-2ccf-5cd0b1706ff5 (1.876383ms) from=127.0.0.1:48734
=== RUN   TestBindingRuleReadCommand/read_by_id_prefix
TestBindingRuleReadCommand - 2019/12/30 18:54:48.376603 [DEBUG] http: Request PUT /v1/acl/binding-rule (643.964358ms) from=127.0.0.1:48730
TestBindingRuleReadCommand - 2019/12/30 18:54:48.385931 [DEBUG] http: Request GET /v1/acl/binding-rules (1.860716ms) from=127.0.0.1:48736
TestBindingRuleReadCommand - 2019/12/30 18:54:48.392427 [DEBUG] http: Request GET /v1/acl/binding-rule/9eb3a04a-81dc-fe6f-5563-44aa4fee0025 (1.019693ms) from=127.0.0.1:48736
TestBindingRuleReadCommand - 2019/12/30 18:54:48.394848 [INFO] agent: Requesting shutdown
TestBindingRuleReadCommand - 2019/12/30 18:54:48.394949 [INFO] consul: shutting down server
TestBindingRuleReadCommand - 2019/12/30 18:54:48.395287 [WARN] serf: Shutdown without a Leave
TestBindingRuleReadCommand - 2019/12/30 18:54:48.600026 [WARN] serf: Shutdown without a Leave
TestBindingRuleReadCommand - 2019/12/30 18:54:48.666928 [INFO] manager: shutting down
TestBindingRuleReadCommand - 2019/12/30 18:54:48.667867 [INFO] agent: consul server down
TestBindingRuleReadCommand - 2019/12/30 18:54:48.667938 [INFO] agent: shutdown complete
TestBindingRuleReadCommand - 2019/12/30 18:54:48.668000 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (tcp)
TestBindingRuleReadCommand - 2019/12/30 18:54:48.668167 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (udp)
TestBindingRuleReadCommand - 2019/12/30 18:54:48.668335 [INFO] agent: Stopping HTTP server 127.0.0.1:49002 (tcp)
TestBindingRuleReadCommand - 2019/12/30 18:54:48.669343 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleReadCommand - 2019/12/30 18:54:48.671617 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleReadCommand (7.35s)
    --- PASS: TestBindingRuleReadCommand/id_required (0.00s)
    --- PASS: TestBindingRuleReadCommand/read_by_id_not_found (0.01s)
    --- PASS: TestBindingRuleReadCommand/read_by_id (0.44s)
    --- PASS: TestBindingRuleReadCommand/read_by_id_prefix (0.66s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/read	7.687s
=== RUN   TestBindingRuleUpdateCommand_noTabs
=== PAUSE TestBindingRuleUpdateCommand_noTabs
=== RUN   TestBindingRuleUpdateCommand
=== PAUSE TestBindingRuleUpdateCommand
=== RUN   TestBindingRuleUpdateCommand_noMerge
=== PAUSE TestBindingRuleUpdateCommand_noMerge
=== CONT  TestBindingRuleUpdateCommand_noTabs
=== CONT  TestBindingRuleUpdateCommand_noMerge
=== CONT  TestBindingRuleUpdateCommand
--- PASS: TestBindingRuleUpdateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:58.677502 [WARN] agent: Node name "Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:58.678368 [DEBUG] tlsutil: Update with version 1
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:58.687099 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleUpdateCommand - 2019/12/30 18:54:58.701577 [WARN] agent: Node name "Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleUpdateCommand - 2019/12/30 18:54:58.702826 [DEBUG] tlsutil: Update with version 1
TestBindingRuleUpdateCommand - 2019/12/30 18:54:58.705895 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:54:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2a2b07e1-eab4-e967-3670-6f8c376dbd42 Address:127.0.0.1:37006}]
2019/12/30 18:54:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4745dcec-8e1f-a9f2-0196-d8af81f4339d Address:127.0.0.1:37012}]
2019/12/30 18:54:59 [INFO]  raft: Node at 127.0.0.1:37006 [Follower] entering Follower state (Leader: "")
2019/12/30 18:54:59 [INFO]  raft: Node at 127.0.0.1:37012 [Follower] entering Follower state (Leader: "")
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.628504 [INFO] serf: EventMemberJoin: Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42.dc1 127.0.0.1
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.630136 [INFO] serf: EventMemberJoin: Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d.dc1 127.0.0.1
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.647067 [INFO] serf: EventMemberJoin: Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d 127.0.0.1
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.647280 [INFO] serf: EventMemberJoin: Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42 127.0.0.1
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.652940 [INFO] consul: Adding LAN server Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d (Addr: tcp/127.0.0.1:37012) (DC: dc1)
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.654148 [INFO] consul: Handled member-join event for server "Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d.dc1" in area "wan"
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.654628 [INFO] agent: Started DNS server 127.0.0.1:37007 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.654827 [INFO] consul: Handled member-join event for server "Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42.dc1" in area "wan"
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.655293 [INFO] agent: Started DNS server 127.0.0.1:37001 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.656100 [INFO] agent: Started DNS server 127.0.0.1:37001 (udp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.658528 [INFO] consul: Adding LAN server Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42 (Addr: tcp/127.0.0.1:37006) (DC: dc1)
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.659317 [INFO] agent: Started DNS server 127.0.0.1:37007 (udp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.661483 [INFO] agent: Started HTTP server on 127.0.0.1:37002 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:54:59.661649 [INFO] agent: started state syncer
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.663046 [INFO] agent: Started HTTP server on 127.0.0.1:37008 (tcp)
TestBindingRuleUpdateCommand - 2019/12/30 18:54:59.663326 [INFO] agent: started state syncer
2019/12/30 18:54:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:59 [INFO]  raft: Node at 127.0.0.1:37012 [Candidate] entering Candidate state in term 2
2019/12/30 18:54:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:54:59 [INFO]  raft: Node at 127.0.0.1:37006 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:01 [INFO]  raft: Node at 127.0.0.1:37006 [Leader] entering Leader state
2019/12/30 18:55:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:01 [INFO]  raft: Node at 127.0.0.1:37012 [Leader] entering Leader state
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:01.743085 [INFO] consul: cluster leadership acquired
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:01.743826 [INFO] consul: New leader elected: Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42
TestBindingRuleUpdateCommand - 2019/12/30 18:55:01.744652 [INFO] consul: cluster leadership acquired
TestBindingRuleUpdateCommand - 2019/12/30 18:55:01.745177 [INFO] consul: New leader elected: Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:01.778980 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleUpdateCommand - 2019/12/30 18:55:01.817046 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:01.977067 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:02.477572 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand - 2019/12/30 18:55:02.593054 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:02.753366 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:02.753455 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand - 2019/12/30 18:55:02.805151 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:02.806621 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:02.806739 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand - 2019/12/30 18:55:02.890703 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleUpdateCommand - 2019/12/30 18:55:02.890821 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand - 2019/12/30 18:55:03.075348 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleUpdateCommand - 2019/12/30 18:55:03.075490 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:03.553827 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:03.559460 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand - 2019/12/30 18:55:03.859459 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand - 2019/12/30 18:55:03.866258 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:04.092884 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:04.092989 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:04.093394 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:04.093798 [INFO] serf: EventMemberUpdate: Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:04.094498 [INFO] serf: EventMemberUpdate: Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42.dc1
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:04.095105 [INFO] serf: EventMemberUpdate: Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:04.096520 [INFO] serf: EventMemberUpdate: Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42.dc1
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.268037 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.268166 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.268467 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.269013 [INFO] serf: EventMemberUpdate: Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.269996 [INFO] serf: EventMemberUpdate: Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d.dc1
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.270777 [INFO] serf: EventMemberUpdate: Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.271458 [INFO] serf: EventMemberUpdate: Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d.dc1
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.717591 [INFO] agent: Synced node info
TestBindingRuleUpdateCommand - 2019/12/30 18:55:04.717716 [DEBUG] agent: Node info in sync
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:05.288612 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:05.289344 [DEBUG] consul: Skipping self join check for "Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42" since the cluster is too small
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:05.289669 [INFO] consul: member 'Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42' joined, marking health alive
TestBindingRuleUpdateCommand - 2019/12/30 18:55:05.433710 [DEBUG] http: Request PUT /v1/acl/auth-method (710.449775ms) from=127.0.0.1:47076
=== RUN   TestBindingRuleUpdateCommand/rule_id_required
=== RUN   TestBindingRuleUpdateCommand/rule_id_partial_matches_nothing
TestBindingRuleUpdateCommand - 2019/12/30 18:55:05.446357 [DEBUG] http: Request GET /v1/acl/binding-rules (2.255727ms) from=127.0.0.1:47078
=== RUN   TestBindingRuleUpdateCommand/rule_id_exact_match_doesn't_exist
TestBindingRuleUpdateCommand - 2019/12/30 18:55:05.477422 [DEBUG] http: Request GET /v1/acl/binding-rule/d632ee44-8d63-8a50-fb22-e218fc304867 (1.600042ms) from=127.0.0.1:47080
=== RUN   TestBindingRuleUpdateCommand/rule_id_partial_matches_multiple
TestBindingRuleUpdateCommand - 2019/12/30 18:55:05.481514 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.966718ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand - 2019/12/30 18:55:05.486023 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:05.665904 [DEBUG] consul: Skipping self join check for "Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42" since the cluster is too small
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:05.666948 [DEBUG] consul: Skipping self join check for "Node 2a2b07e1-eab4-e967-3670-6f8c376dbd42" since the cluster is too small
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.025911 [DEBUG] http: Request PUT /v1/acl/auth-method (350.319592ms) from=127.0.0.1:58962
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_required
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_nothing
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.042179 [DEBUG] http: Request GET /v1/acl/binding-rules (2.055387ms) from=127.0.0.1:58964
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_exact_match_doesn't_exist
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.049211 [DEBUG] http: Request GET /v1/acl/binding-rule/4e9351ee-a912-db03-3414-0c414f325980 (603.349µs) from=127.0.0.1:58966
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_multiple
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.053511 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (2.14439ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.056281 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleUpdateCommand - 2019/12/30 18:55:06.459612 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleUpdateCommand - 2019/12/30 18:55:06.460347 [DEBUG] consul: Skipping self join check for "Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d" since the cluster is too small
TestBindingRuleUpdateCommand - 2019/12/30 18:55:06.460519 [INFO] consul: member 'Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d' joined, marking health alive
TestBindingRuleUpdateCommand - 2019/12/30 18:55:06.461620 [DEBUG] http: Request PUT /v1/acl/binding-rule (976.555808ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.465745 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleUpdateCommand - 2019/12/30 18:55:06.478822 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (13.241684ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.483503 [DEBUG] http: Request PUT /v1/acl/binding-rule (428.042645ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.508839 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.413038ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.846370 [DEBUG] http: Request PUT /v1/acl/binding-rule (327.370318ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:06.854862 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.102362ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.183327 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.185711 [DEBUG] consul: Skipping self join check for "Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d" since the cluster is too small
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.186484 [DEBUG] consul: Skipping self join check for "Node 4745dcec-8e1f-a9f2-0196-d8af81f4339d" since the cluster is too small
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.186756 [DEBUG] http: Request PUT /v1/acl/binding-rule (677.145228ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.190385 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.138363ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:07.268576 [DEBUG] http: Request PUT /v1/acl/binding-rule (411.139531ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:07.273648 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.348369ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.428540 [DEBUG] http: Request PUT /v1/acl/binding-rule (235.342552ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.432186 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.084695ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:07.685112 [DEBUG] http: Request PUT /v1/acl/binding-rule (408.706801ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:07.689128 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.091362ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.808255 [DEBUG] http: Request PUT /v1/acl/binding-rule (373.269864ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.813278 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.974719ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand - 2019/12/30 18:55:07.822876 [DEBUG] http: Request GET /v1/acl/binding-rules (2.75974ms) from=127.0.0.1:47088
=== RUN   TestBindingRuleUpdateCommand/must_use_roughly_valid_selector
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:07.962691 [DEBUG] http: Request PUT /v1/acl/binding-rule (266.223035ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:07.967978 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.393037ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:07.984829 [DEBUG] http: Request GET /v1/acl/binding-rules (1.932051ms) from=127.0.0.1:58970
=== RUN   TestBindingRuleUpdateCommand_noMerge/must_use_roughly_valid_selector
TestBindingRuleUpdateCommand - 2019/12/30 18:55:08.218044 [DEBUG] http: Request PUT /v1/acl/binding-rule (390.877662ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand - 2019/12/30 18:55:08.229033 [DEBUG] http: Request GET /v1/acl/binding-rule/d4faff75-a08f-38c5-4165-501225b51aa6 (1.249366ms) from=127.0.0.1:47092
TestBindingRuleUpdateCommand - 2019/12/30 18:55:08.239676 [ERR] http: Request PUT /v1/acl/binding-rule/d4faff75-a08f-38c5-4165-501225b51aa6, error: invalid Binding Rule: Selector is invalid: 1:4 (3): no match found, expected: "!=", ".", "==", "[", [ \t\r\n] or [a-zA-Z0-9_] from=127.0.0.1:47092
TestBindingRuleUpdateCommand - 2019/12/30 18:55:08.240724 [DEBUG] http: Request PUT /v1/acl/binding-rule/d4faff75-a08f-38c5-4165-501225b51aa6 (6.833848ms) from=127.0.0.1:47092
=== RUN   TestBindingRuleUpdateCommand/update_all_fields
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:08.380585 [DEBUG] http: Request PUT /v1/acl/binding-rule (390.775993ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:08.393327 [DEBUG] http: Request GET /v1/acl/binding-rule/2ddfae20-9834-3215-df95-877e5cc57c4c (1.091029ms) from=127.0.0.1:58974
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:08.398745 [ERR] http: Request PUT /v1/acl/binding-rule/2ddfae20-9834-3215-df95-877e5cc57c4c, error: invalid Binding Rule: Selector is invalid: 1:4 (3): no match found, expected: "!=", ".", "==", "[", [ \t\r\n] or [a-zA-Z0-9_] from=127.0.0.1:58974
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:08.400006 [DEBUG] http: Request PUT /v1/acl/binding-rule/2ddfae20-9834-3215-df95-877e5cc57c4c (4.154776ms) from=127.0.0.1:58974
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:08.776442 [DEBUG] http: Request PUT /v1/acl/binding-rule (373.307199ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand - 2019/12/30 18:55:08.783938 [DEBUG] http: Request PUT /v1/acl/binding-rule (539.250583ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:08.791491 [DEBUG] http: Request GET /v1/acl/binding-rule/6ad47ab7-1d6b-f3fc-8566-ab0be6b270ec (1.342369ms) from=127.0.0.1:58976
TestBindingRuleUpdateCommand - 2019/12/30 18:55:08.802017 [DEBUG] http: Request GET /v1/acl/binding-rule/39616cda-4a41-f9c8-7f43-edb612d20cf5 (1.103029ms) from=127.0.0.1:47098
TestBindingRuleUpdateCommand - 2019/12/30 18:55:09.168086 [DEBUG] http: Request PUT /v1/acl/binding-rule/39616cda-4a41-f9c8-7f43-edb612d20cf5 (363.088595ms) from=127.0.0.1:47098
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:09.170581 [DEBUG] http: Request PUT /v1/acl/binding-rule/6ad47ab7-1d6b-f3fc-8566-ab0be6b270ec (375.496923ms) from=127.0.0.1:58976
TestBindingRuleUpdateCommand - 2019/12/30 18:55:09.175136 [DEBUG] http: Request GET /v1/acl/binding-rule/39616cda-4a41-f9c8-7f43-edb612d20cf5 (920.024µs) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:09.175813 [DEBUG] http: Request GET /v1/acl/binding-rule/6ad47ab7-1d6b-f3fc-8566-ab0be6b270ec (1.679378ms) from=127.0.0.1:58962
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_-_partial
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields_-_partial
TestBindingRuleUpdateCommand - 2019/12/30 18:55:09.182610 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (4.260446ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:09.185955 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (7.523199ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:09.584731 [DEBUG] http: Request DELETE /v1/acl/binding-rule/1a7a534e-68e8-dfa1-19a7-f138cdde1619 (392.702044ms) from=127.0.0.1:58962
TestBindingRuleUpdateCommand - 2019/12/30 18:55:09.589038 [DEBUG] http: Request DELETE /v1/acl/binding-rule/39616cda-4a41-f9c8-7f43-edb612d20cf5 (391.861688ms) from=127.0.0.1:47076
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:09.951876 [DEBUG] http: Request DELETE /v1/acl/binding-rule/1f3e89e8-580d-1a7e-36b2-8420e4547545 (358.106463ms) from=127.0.0.1:58980
TestBindingRuleUpdateCommand - 2019/12/30 18:55:09.951876 [DEBUG] http: Request DELETE /v1/acl/binding-rule/c146ee5f-25d8-963a-ac9b-774a22e8d9f4 (359.424831ms) from=127.0.0.1:47102
TestBindingRuleUpdateCommand - 2019/12/30 18:55:10.259506 [DEBUG] http: Request DELETE /v1/acl/binding-rule/c3165f8e-6da2-3aab-1baa-60a9a9d8a469 (300.933285ms) from=127.0.0.1:47104
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:10.260781 [DEBUG] http: Request DELETE /v1/acl/binding-rule/2ddfae20-9834-3215-df95-877e5cc57c4c (303.252681ms) from=127.0.0.1:58986
TestBindingRuleUpdateCommand - 2019/12/30 18:55:10.518327 [DEBUG] http: Request DELETE /v1/acl/binding-rule/d1f27206-8910-7821-a810-4534c99dec94 (252.470005ms) from=127.0.0.1:47108
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:10.518424 [DEBUG] http: Request DELETE /v1/acl/binding-rule/6ad47ab7-1d6b-f3fc-8566-ab0be6b270ec (252.43767ms) from=127.0.0.1:58988
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:10.801822 [DEBUG] http: Request DELETE /v1/acl/binding-rule/7b07eb0c-7425-a767-24d5-cdae7b52cc39 (276.754646ms) from=127.0.0.1:58994
TestBindingRuleUpdateCommand - 2019/12/30 18:55:10.801869 [DEBUG] http: Request DELETE /v1/acl/binding-rule/d4faff75-a08f-38c5-4165-501225b51aa6 (276.755647ms) from=127.0.0.1:47112
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:11.119135 [DEBUG] http: Request DELETE /v1/acl/binding-rule/85e3d6c9-f4a1-3dca-ee2e-175e66e41486 (307.221452ms) from=127.0.0.1:58998
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.122162 [DEBUG] http: Request DELETE /v1/acl/binding-rule/ed47ede5-74c5-d9ea-fe0a-24dcd19240af (311.517232ms) from=127.0.0.1:47116
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:11.334615 [DEBUG] http: Request DELETE /v1/acl/binding-rule/cb548bcb-9381-e360-6260-ccf52addb0fc (212.212274ms) from=127.0.0.1:59000
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.337241 [DEBUG] http: Request PUT /v1/acl/binding-rule (211.262249ms) from=127.0.0.1:47122
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.345404 [DEBUG] http: Request GET /v1/acl/binding-rules (1.190364ms) from=127.0.0.1:47126
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.351188 [DEBUG] http: Request GET /v1/acl/binding-rule/55fffc4c-acf2-7871-382d-324e855e8d33 (1.667711ms) from=127.0.0.1:47126
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.618894 [DEBUG] http: Request PUT /v1/acl/binding-rule/55fffc4c-acf2-7871-382d-324e855e8d33 (264.355319ms) from=127.0.0.1:47126
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:11.618894 [DEBUG] http: Request PUT /v1/acl/binding-rule (276.501307ms) from=127.0.0.1:59004
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.635017 [DEBUG] http: Request GET /v1/acl/binding-rule/55fffc4c-acf2-7871-382d-324e855e8d33 (1.445372ms) from=127.0.0.1:47122
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_description
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:11.654775 [DEBUG] http: Request GET /v1/acl/binding-rules (1.316702ms) from=127.0.0.1:59008
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:11.658376 [DEBUG] http: Request GET /v1/acl/binding-rule/fa6beebf-532c-f3a8-abf8-a0d3797ef1ff (860.356µs) from=127.0.0.1:59008
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:11.977265 [DEBUG] http: Request PUT /v1/acl/binding-rule/fa6beebf-532c-f3a8-abf8-a0d3797ef1ff (314.689982ms) from=127.0.0.1:59008
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.977725 [DEBUG] http: Request PUT /v1/acl/binding-rule (335.010853ms) from=127.0.0.1:47122
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:11.990284 [DEBUG] http: Request GET /v1/acl/binding-rule/fa6beebf-532c-f3a8-abf8-a0d3797ef1ff (1.233032ms) from=127.0.0.1:59004
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_description
TestBindingRuleUpdateCommand - 2019/12/30 18:55:11.995774 [DEBUG] http: Request GET /v1/acl/binding-rule/d8cc7726-11e0-7ed3-4c65-7451889ed1c1 (1.132363ms) from=127.0.0.1:47130
TestBindingRuleUpdateCommand - 2019/12/30 18:55:12.236954 [DEBUG] http: Request PUT /v1/acl/binding-rule/d8cc7726-11e0-7ed3-4c65-7451889ed1c1 (236.222575ms) from=127.0.0.1:47130
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:12.239674 [DEBUG] http: Request PUT /v1/acl/binding-rule (246.156505ms) from=127.0.0.1:59004
TestBindingRuleUpdateCommand - 2019/12/30 18:55:12.246887 [DEBUG] http: Request GET /v1/acl/binding-rule/d8cc7726-11e0-7ed3-4c65-7451889ed1c1 (1.102029ms) from=127.0.0.1:47122
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_bind_name
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:12.258418 [DEBUG] http: Request GET /v1/acl/binding-rule/acfd77bc-dd95-70e9-5201-05e502b40ad9 (1.393037ms) from=127.0.0.1:59012
TestBindingRuleUpdateCommand - 2019/12/30 18:55:12.593400 [DEBUG] http: Request PUT /v1/acl/binding-rule (343.108733ms) from=127.0.0.1:47122
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:12.594885 [DEBUG] http: Request PUT /v1/acl/binding-rule/acfd77bc-dd95-70e9-5201-05e502b40ad9 (333.215805ms) from=127.0.0.1:59012
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:12.603694 [DEBUG] http: Request GET /v1/acl/binding-rule/acfd77bc-dd95-70e9-5201-05e502b40ad9 (1.188032ms) from=127.0.0.1:59004
TestBindingRuleUpdateCommand - 2019/12/30 18:55:12.606358 [DEBUG] http: Request GET /v1/acl/binding-rule/5c54ebb9-ee5d-926b-ff61-a2c6fb197883 (1.478372ms) from=127.0.0.1:47134
=== RUN   TestBindingRuleUpdateCommand_noMerge/missing_bind_name
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:12.901822 [DEBUG] http: Request PUT /v1/acl/binding-rule (291.861045ms) from=127.0.0.1:59004
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:12.909335 [DEBUG] http: Request GET /v1/acl/binding-rule/9a2939dc-198b-2f61-aba0-c9b7dfa296e8 (1.2337ms) from=127.0.0.1:59016
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_selector
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.029022 [DEBUG] http: Request PUT /v1/acl/binding-rule/5c54ebb9-ee5d-926b-ff61-a2c6fb197883 (415.540314ms) from=127.0.0.1:47134
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.032533 [DEBUG] http: Request GET /v1/acl/binding-rule/5c54ebb9-ee5d-926b-ff61-a2c6fb197883 (865.023µs) from=127.0.0.1:47122
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_must_exist
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.137151 [DEBUG] http: Request PUT /v1/acl/binding-rule (223.683911ms) from=127.0.0.1:59004
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.152654 [DEBUG] http: Request GET /v1/acl/binding-rule/934028b3-c1ed-203a-743d-cb77df40ee92 (1.62371ms) from=127.0.0.1:59018
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.263736 [DEBUG] http: Request PUT /v1/acl/binding-rule (228.182696ms) from=127.0.0.1:47122
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.276273 [DEBUG] http: Request GET /v1/acl/binding-rule/d9f204a3-2f7f-a638-9eab-14c1a3f2bd49 (1.853716ms) from=127.0.0.1:47140
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.393232 [DEBUG] http: Request PUT /v1/acl/binding-rule/934028b3-c1ed-203a-743d-cb77df40ee92 (235.41822ms) from=127.0.0.1:59018
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.397823 [DEBUG] http: Request GET /v1/acl/binding-rule/934028b3-c1ed-203a-743d-cb77df40ee92 (1.175031ms) from=127.0.0.1:59004
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.402021 [INFO] agent: Requesting shutdown
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.402216 [INFO] consul: shutting down server
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.402314 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.483403 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.633212 [INFO] manager: shutting down
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.634152 [INFO] agent: consul server down
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.634226 [INFO] agent: shutdown complete
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.634291 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.634583 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (udp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.634840 [INFO] agent: Stopping HTTP server 127.0.0.1:37002 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.637142 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleUpdateCommand_noMerge - 2019/12/30 18:55:13.637311 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleUpdateCommand_noMerge (15.06s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_required (0.00s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_nothing (0.01s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_exact_match_doesn't_exist (0.01s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_multiple (1.94s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/must_use_roughly_valid_selector (0.41s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields (0.78s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields_-_partial (2.81s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_description (0.61s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/missing_bind_name (0.30s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_selector (0.49s)
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.638739 [DEBUG] http: Request PUT /v1/acl/binding-rule/d9f204a3-2f7f-a638-9eab-14c1a3f2bd49 (358.172798ms) from=127.0.0.1:47140
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.644512 [DEBUG] http: Request GET /v1/acl/binding-rule/d9f204a3-2f7f-a638-9eab-14c1a3f2bd49 (1.01836ms) from=127.0.0.1:47122
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_selector
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.937017 [DEBUG] http: Request PUT /v1/acl/binding-rule (288.134947ms) from=127.0.0.1:47122
TestBindingRuleUpdateCommand - 2019/12/30 18:55:13.944783 [DEBUG] http: Request GET /v1/acl/binding-rule/a12683f8-7f35-decf-f5a6-abd1598ab9cf (1.160364ms) from=127.0.0.1:47142
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.270680 [DEBUG] http: Request PUT /v1/acl/binding-rule/a12683f8-7f35-decf-f5a6-abd1598ab9cf (322.83853ms) from=127.0.0.1:47142
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.274933 [DEBUG] http: Request GET /v1/acl/binding-rule/a12683f8-7f35-decf-f5a6-abd1598ab9cf (1.090029ms) from=127.0.0.1:47122
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_clear_selector
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.493512 [DEBUG] http: Request PUT /v1/acl/binding-rule (214.919679ms) from=127.0.0.1:47122
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.513194 [DEBUG] http: Request GET /v1/acl/binding-rule/a39b2d9d-37a9-3105-94fe-215f9d533f90 (1.630377ms) from=127.0.0.1:47146
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.810682 [DEBUG] http: Request PUT /v1/acl/binding-rule/a39b2d9d-37a9-3105-94fe-215f9d533f90 (293.409419ms) from=127.0.0.1:47146
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.817062 [DEBUG] http: Request GET /v1/acl/binding-rule/a39b2d9d-37a9-3105-94fe-215f9d533f90 (855.356µs) from=127.0.0.1:47122
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.818958 [INFO] agent: Requesting shutdown
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.819050 [INFO] consul: shutting down server
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.819098 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand - 2019/12/30 18:55:14.993593 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.108271 [INFO] manager: shutting down
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.109181 [INFO] agent: consul server down
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.109256 [INFO] agent: shutdown complete
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.109322 [INFO] agent: Stopping DNS server 127.0.0.1:37007 (tcp)
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.109582 [INFO] agent: Stopping DNS server 127.0.0.1:37007 (udp)
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.109782 [INFO] agent: Stopping HTTP server 127.0.0.1:37008 (tcp)
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.113769 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleUpdateCommand - 2019/12/30 18:55:15.113917 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleUpdateCommand (16.54s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_required (0.00s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_partial_matches_nothing (0.01s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_exact_match_doesn't_exist (0.03s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_partial_matches_multiple (2.35s)
    --- PASS: TestBindingRuleUpdateCommand/must_use_roughly_valid_selector (0.42s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields (0.93s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_-_partial (2.46s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_description (0.61s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_bind_name (0.79s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_must_exist (0.61s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_selector (0.63s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_clear_selector (0.54s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/update	16.851s
=== RUN   TestBootstrapCommand_noTabs
=== PAUSE TestBootstrapCommand_noTabs
=== RUN   TestBootstrapCommand
=== PAUSE TestBootstrapCommand
=== CONT  TestBootstrapCommand_noTabs
=== CONT  TestBootstrapCommand
--- PASS: TestBootstrapCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestBootstrapCommand - 2019/12/30 18:55:08.277709 [WARN] agent: Node name "Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBootstrapCommand - 2019/12/30 18:55:08.278457 [DEBUG] tlsutil: Update with version 1
TestBootstrapCommand - 2019/12/30 18:55:08.284755 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d Address:127.0.0.1:10006}]
2019/12/30 18:55:09 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestBootstrapCommand - 2019/12/30 18:55:09.975886 [INFO] serf: EventMemberJoin: Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d.dc1 127.0.0.1
TestBootstrapCommand - 2019/12/30 18:55:09.984277 [INFO] serf: EventMemberJoin: Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d 127.0.0.1
TestBootstrapCommand - 2019/12/30 18:55:09.986620 [INFO] consul: Adding LAN server Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestBootstrapCommand - 2019/12/30 18:55:09.987438 [INFO] consul: Handled member-join event for server "Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d.dc1" in area "wan"
TestBootstrapCommand - 2019/12/30 18:55:09.990523 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestBootstrapCommand - 2019/12/30 18:55:09.991348 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestBootstrapCommand - 2019/12/30 18:55:09.993779 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestBootstrapCommand - 2019/12/30 18:55:09.993935 [INFO] agent: started state syncer
2019/12/30 18:55:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:10 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:10 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:10 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestBootstrapCommand - 2019/12/30 18:55:10.950051 [INFO] consul: cluster leadership acquired
TestBootstrapCommand - 2019/12/30 18:55:10.950673 [INFO] consul: New leader elected: Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d
TestBootstrapCommand - 2019/12/30 18:55:11.002281 [ERR] agent: failed to sync remote state: ACL not found
TestBootstrapCommand - 2019/12/30 18:55:11.336662 [INFO] acl: initializing acls
TestBootstrapCommand - 2019/12/30 18:55:11.539022 [INFO] acl: initializing acls
TestBootstrapCommand - 2019/12/30 18:55:11.622553 [INFO] consul: Created ACL 'global-management' policy
TestBootstrapCommand - 2019/12/30 18:55:12.128917 [ERR] agent: failed to sync remote state: ACL not found
TestBootstrapCommand - 2019/12/30 18:55:12.394826 [INFO] consul: Created ACL 'global-management' policy
TestBootstrapCommand - 2019/12/30 18:55:12.395640 [INFO] consul: Created ACL anonymous token from configuration
TestBootstrapCommand - 2019/12/30 18:55:12.396914 [INFO] serf: EventMemberUpdate: Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d
TestBootstrapCommand - 2019/12/30 18:55:12.397684 [INFO] serf: EventMemberUpdate: Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d.dc1
TestBootstrapCommand - 2019/12/30 18:55:13.018884 [INFO] consul: Created ACL anonymous token from configuration
TestBootstrapCommand - 2019/12/30 18:55:13.018987 [DEBUG] acl: transitioning out of legacy ACL mode
TestBootstrapCommand - 2019/12/30 18:55:13.021585 [INFO] serf: EventMemberUpdate: Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d
TestBootstrapCommand - 2019/12/30 18:55:13.028394 [INFO] serf: EventMemberUpdate: Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d.dc1
TestBootstrapCommand - 2019/12/30 18:55:14.059122 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBootstrapCommand - 2019/12/30 18:55:14.060727 [DEBUG] consul: Skipping self join check for "Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d" since the cluster is too small
TestBootstrapCommand - 2019/12/30 18:55:14.060933 [INFO] consul: member 'Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d' joined, marking health alive
TestBootstrapCommand - 2019/12/30 18:55:14.272166 [DEBUG] consul: Skipping self join check for "Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d" since the cluster is too small
TestBootstrapCommand - 2019/12/30 18:55:14.272862 [DEBUG] consul: Skipping self join check for "Node d5d9ff95-0e95-dff3-eb94-ca33d6f7f72d" since the cluster is too small
TestBootstrapCommand - 2019/12/30 18:55:14.282079 [WARN] acl.bootstrap: failed to remove bootstrap file: remove /tmp/TestBootstrapCommand-agent164556249/acl-bootstrap-reset: no such file or directory
TestBootstrapCommand - 2019/12/30 18:55:14.493330 [INFO] consul.acl: ACL bootstrap completed
TestBootstrapCommand - 2019/12/30 18:55:14.497566 [DEBUG] http: Request PUT /v1/acl/bootstrap (216.008707ms) from=127.0.0.1:52066
TestBootstrapCommand - 2019/12/30 18:55:14.505641 [INFO] agent: Requesting shutdown
TestBootstrapCommand - 2019/12/30 18:55:14.505756 [INFO] consul: shutting down server
TestBootstrapCommand - 2019/12/30 18:55:14.505809 [WARN] serf: Shutdown without a Leave
TestBootstrapCommand - 2019/12/30 18:55:14.717369 [WARN] serf: Shutdown without a Leave
TestBootstrapCommand - 2019/12/30 18:55:14.820603 [INFO] manager: shutting down
TestBootstrapCommand - 2019/12/30 18:55:14.821034 [INFO] agent: consul server down
TestBootstrapCommand - 2019/12/30 18:55:14.821084 [INFO] agent: shutdown complete
TestBootstrapCommand - 2019/12/30 18:55:14.821139 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestBootstrapCommand - 2019/12/30 18:55:14.821525 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestBootstrapCommand - 2019/12/30 18:55:14.821731 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestBootstrapCommand - 2019/12/30 18:55:14.822215 [INFO] agent: Waiting for endpoints to shut down
TestBootstrapCommand - 2019/12/30 18:55:14.822587 [INFO] agent: Endpoints down
--- PASS: TestBootstrapCommand (6.62s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bootstrap	6.893s
?   	github.com/hashicorp/consul/command/acl/policy	[no test files]
=== RUN   TestPolicyCreateCommand_noTabs
=== PAUSE TestPolicyCreateCommand_noTabs
=== RUN   TestPolicyCreateCommand
=== PAUSE TestPolicyCreateCommand
=== CONT  TestPolicyCreateCommand_noTabs
=== CONT  TestPolicyCreateCommand
--- PASS: TestPolicyCreateCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyCreateCommand - 2019/12/30 18:55:32.780928 [WARN] agent: Node name "Node 6943924d-9889-63c0-5c4c-4afb85b6061b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyCreateCommand - 2019/12/30 18:55:32.781935 [DEBUG] tlsutil: Update with version 1
TestPolicyCreateCommand - 2019/12/30 18:55:32.798327 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:55:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6943924d-9889-63c0-5c4c-4afb85b6061b Address:127.0.0.1:47506}]
2019/12/30 18:55:33 [INFO]  raft: Node at 127.0.0.1:47506 [Follower] entering Follower state (Leader: "")
TestPolicyCreateCommand - 2019/12/30 18:55:33.964073 [INFO] serf: EventMemberJoin: Node 6943924d-9889-63c0-5c4c-4afb85b6061b.dc1 127.0.0.1
TestPolicyCreateCommand - 2019/12/30 18:55:33.969582 [INFO] serf: EventMemberJoin: Node 6943924d-9889-63c0-5c4c-4afb85b6061b 127.0.0.1
TestPolicyCreateCommand - 2019/12/30 18:55:33.973217 [INFO] agent: Started DNS server 127.0.0.1:47501 (udp)
TestPolicyCreateCommand - 2019/12/30 18:55:33.973502 [INFO] consul: Handled member-join event for server "Node 6943924d-9889-63c0-5c4c-4afb85b6061b.dc1" in area "wan"
TestPolicyCreateCommand - 2019/12/30 18:55:33.973690 [INFO] agent: Started DNS server 127.0.0.1:47501 (tcp)
TestPolicyCreateCommand - 2019/12/30 18:55:33.974068 [INFO] consul: Adding LAN server Node 6943924d-9889-63c0-5c4c-4afb85b6061b (Addr: tcp/127.0.0.1:47506) (DC: dc1)
TestPolicyCreateCommand - 2019/12/30 18:55:33.979572 [INFO] agent: Started HTTP server on 127.0.0.1:47502 (tcp)
TestPolicyCreateCommand - 2019/12/30 18:55:33.980036 [INFO] agent: started state syncer
2019/12/30 18:55:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:55:33 [INFO]  raft: Node at 127.0.0.1:47506 [Candidate] entering Candidate state in term 2
2019/12/30 18:55:36 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:55:36 [INFO]  raft: Node at 127.0.0.1:47506 [Leader] entering Leader state
TestPolicyCreateCommand - 2019/12/30 18:55:36.134017 [INFO] consul: cluster leadership acquired
TestPolicyCreateCommand - 2019/12/30 18:55:36.134674 [INFO] consul: New leader elected: Node 6943924d-9889-63c0-5c4c-4afb85b6061b
TestPolicyCreateCommand - 2019/12/30 18:55:36.149520 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyCreateCommand - 2019/12/30 18:55:36.550881 [INFO] acl: initializing acls
TestPolicyCreateCommand - 2019/12/30 18:55:36.743010 [INFO] consul: Created ACL 'global-management' policy
TestPolicyCreateCommand - 2019/12/30 18:55:36.743108 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyCreateCommand - 2019/12/30 18:55:37.038853 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyCreateCommand - 2019/12/30 18:55:37.127213 [INFO] acl: initializing acls
TestPolicyCreateCommand - 2019/12/30 18:55:37.127348 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyCreateCommand - 2019/12/30 18:55:37.431826 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyCreateCommand - 2019/12/30 18:55:37.756299 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyCreateCommand - 2019/12/30 18:55:37.757015 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyCreateCommand - 2019/12/30 18:55:37.757069 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyCreateCommand - 2019/12/30 18:55:37.757088 [INFO] serf: EventMemberUpdate: Node 6943924d-9889-63c0-5c4c-4afb85b6061b
TestPolicyCreateCommand - 2019/12/30 18:55:37.757672 [INFO] serf: EventMemberUpdate: Node 6943924d-9889-63c0-5c4c-4afb85b6061b.dc1
TestPolicyCreateCommand - 2019/12/30 18:55:37.758427 [INFO] serf: EventMemberUpdate: Node 6943924d-9889-63c0-5c4c-4afb85b6061b
TestPolicyCreateCommand - 2019/12/30 18:55:37.759105 [INFO] serf: EventMemberUpdate: Node 6943924d-9889-63c0-5c4c-4afb85b6061b.dc1
TestPolicyCreateCommand - 2019/12/30 18:55:39.060982 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyCreateCommand - 2019/12/30 18:55:39.061480 [DEBUG] consul: Skipping self join check for "Node 6943924d-9889-63c0-5c4c-4afb85b6061b" since the cluster is too small
TestPolicyCreateCommand - 2019/12/30 18:55:39.061599 [INFO] consul: member 'Node 6943924d-9889-63c0-5c4c-4afb85b6061b' joined, marking health alive
TestPolicyCreateCommand - 2019/12/30 18:55:39.365414 [DEBUG] consul: Skipping self join check for "Node 6943924d-9889-63c0-5c4c-4afb85b6061b" since the cluster is too small
TestPolicyCreateCommand - 2019/12/30 18:55:39.366216 [DEBUG] consul: Skipping self join check for "Node 6943924d-9889-63c0-5c4c-4afb85b6061b" since the cluster is too small
TestPolicyCreateCommand - 2019/12/30 18:55:39.752011 [DEBUG] http: Request PUT /v1/acl/policy (373.505531ms) from=127.0.0.1:40516
TestPolicyCreateCommand - 2019/12/30 18:55:39.763844 [INFO] agent: Requesting shutdown
TestPolicyCreateCommand - 2019/12/30 18:55:39.763939 [INFO] consul: shutting down server
TestPolicyCreateCommand - 2019/12/30 18:55:39.763984 [WARN] serf: Shutdown without a Leave
TestPolicyCreateCommand - 2019/12/30 18:55:39.934327 [WARN] serf: Shutdown without a Leave
TestPolicyCreateCommand - 2019/12/30 18:55:40.033732 [INFO] manager: shutting down
TestPolicyCreateCommand - 2019/12/30 18:55:40.034289 [INFO] agent: consul server down
TestPolicyCreateCommand - 2019/12/30 18:55:40.034345 [INFO] agent: shutdown complete
TestPolicyCreateCommand - 2019/12/30 18:55:40.034462 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (tcp)
TestPolicyCreateCommand - 2019/12/30 18:55:40.034605 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (udp)
TestPolicyCreateCommand - 2019/12/30 18:55:40.034775 [INFO] agent: Stopping HTTP server 127.0.0.1:47502 (tcp)
TestPolicyCreateCommand - 2019/12/30 18:55:40.035204 [INFO] agent: Waiting for endpoints to shut down
TestPolicyCreateCommand - 2019/12/30 18:55:40.035387 [INFO] agent: Endpoints down
--- PASS: TestPolicyCreateCommand (7.33s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/create	7.595s
=== RUN   TestPolicyDeleteCommand_noTabs
=== PAUSE TestPolicyDeleteCommand_noTabs
=== RUN   TestPolicyDeleteCommand
=== PAUSE TestPolicyDeleteCommand
=== CONT  TestPolicyDeleteCommand_noTabs
=== CONT  TestPolicyDeleteCommand
--- PASS: TestPolicyDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyDeleteCommand - 2019/12/30 18:56:01.147903 [WARN] agent: Node name "Node f0b52932-41a0-9d3a-05b1-19aa77c754dd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyDeleteCommand - 2019/12/30 18:56:01.148927 [DEBUG] tlsutil: Update with version 1
TestPolicyDeleteCommand - 2019/12/30 18:56:01.156762 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f0b52932-41a0-9d3a-05b1-19aa77c754dd Address:127.0.0.1:10006}]
2019/12/30 18:56:02 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestPolicyDeleteCommand - 2019/12/30 18:56:02.479130 [INFO] serf: EventMemberJoin: Node f0b52932-41a0-9d3a-05b1-19aa77c754dd.dc1 127.0.0.1
TestPolicyDeleteCommand - 2019/12/30 18:56:02.483079 [INFO] serf: EventMemberJoin: Node f0b52932-41a0-9d3a-05b1-19aa77c754dd 127.0.0.1
TestPolicyDeleteCommand - 2019/12/30 18:56:02.484551 [INFO] consul: Handled member-join event for server "Node f0b52932-41a0-9d3a-05b1-19aa77c754dd.dc1" in area "wan"
TestPolicyDeleteCommand - 2019/12/30 18:56:02.485029 [INFO] consul: Adding LAN server Node f0b52932-41a0-9d3a-05b1-19aa77c754dd (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestPolicyDeleteCommand - 2019/12/30 18:56:02.485502 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestPolicyDeleteCommand - 2019/12/30 18:56:02.485668 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestPolicyDeleteCommand - 2019/12/30 18:56:02.490798 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestPolicyDeleteCommand - 2019/12/30 18:56:02.491085 [INFO] agent: started state syncer
2019/12/30 18:56:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:02 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:03 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:03 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestPolicyDeleteCommand - 2019/12/30 18:56:03.305527 [INFO] consul: cluster leadership acquired
TestPolicyDeleteCommand - 2019/12/30 18:56:03.306105 [INFO] consul: New leader elected: Node f0b52932-41a0-9d3a-05b1-19aa77c754dd
TestPolicyDeleteCommand - 2019/12/30 18:56:03.471541 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyDeleteCommand - 2019/12/30 18:56:03.737871 [INFO] acl: initializing acls
TestPolicyDeleteCommand - 2019/12/30 18:56:03.936118 [INFO] consul: Created ACL 'global-management' policy
TestPolicyDeleteCommand - 2019/12/30 18:56:03.936200 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyDeleteCommand - 2019/12/30 18:56:04.036966 [INFO] acl: initializing acls
TestPolicyDeleteCommand - 2019/12/30 18:56:04.037112 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyDeleteCommand - 2019/12/30 18:56:04.619512 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyDeleteCommand - 2019/12/30 18:56:04.622419 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyDeleteCommand - 2019/12/30 18:56:05.178436 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyDeleteCommand - 2019/12/30 18:56:05.179562 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyDeleteCommand - 2019/12/30 18:56:05.179634 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyDeleteCommand - 2019/12/30 18:56:05.180213 [INFO] serf: EventMemberUpdate: Node f0b52932-41a0-9d3a-05b1-19aa77c754dd
TestPolicyDeleteCommand - 2019/12/30 18:56:05.181082 [INFO] serf: EventMemberUpdate: Node f0b52932-41a0-9d3a-05b1-19aa77c754dd.dc1
TestPolicyDeleteCommand - 2019/12/30 18:56:05.181379 [INFO] serf: EventMemberUpdate: Node f0b52932-41a0-9d3a-05b1-19aa77c754dd
TestPolicyDeleteCommand - 2019/12/30 18:56:05.182568 [INFO] serf: EventMemberUpdate: Node f0b52932-41a0-9d3a-05b1-19aa77c754dd.dc1
TestPolicyDeleteCommand - 2019/12/30 18:56:06.460700 [INFO] agent: Synced node info
TestPolicyDeleteCommand - 2019/12/30 18:56:06.460808 [DEBUG] agent: Node info in sync
TestPolicyDeleteCommand - 2019/12/30 18:56:06.460906 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyDeleteCommand - 2019/12/30 18:56:06.461385 [DEBUG] consul: Skipping self join check for "Node f0b52932-41a0-9d3a-05b1-19aa77c754dd" since the cluster is too small
TestPolicyDeleteCommand - 2019/12/30 18:56:06.461538 [INFO] consul: member 'Node f0b52932-41a0-9d3a-05b1-19aa77c754dd' joined, marking health alive
TestPolicyDeleteCommand - 2019/12/30 18:56:06.857916 [DEBUG] consul: Skipping self join check for "Node f0b52932-41a0-9d3a-05b1-19aa77c754dd" since the cluster is too small
TestPolicyDeleteCommand - 2019/12/30 18:56:06.860120 [DEBUG] consul: Skipping self join check for "Node f0b52932-41a0-9d3a-05b1-19aa77c754dd" since the cluster is too small
TestPolicyDeleteCommand - 2019/12/30 18:56:06.860595 [DEBUG] http: Request PUT /v1/acl/policy (386.907214ms) from=127.0.0.1:52072
TestPolicyDeleteCommand - 2019/12/30 18:56:07.134624 [DEBUG] http: Request DELETE /v1/acl/policy/bf6cd08b-7bf0-ce80-4d0a-78aaf92c1ebe (268.662759ms) from=127.0.0.1:52074
TestPolicyDeleteCommand - 2019/12/30 18:56:07.137704 [ERR] http: Request GET /v1/acl/policy/bf6cd08b-7bf0-ce80-4d0a-78aaf92c1ebe, error: ACL not found from=127.0.0.1:52072
TestPolicyDeleteCommand - 2019/12/30 18:56:07.146963 [DEBUG] http: Request GET /v1/acl/policy/bf6cd08b-7bf0-ce80-4d0a-78aaf92c1ebe (9.715923ms) from=127.0.0.1:52072
TestPolicyDeleteCommand - 2019/12/30 18:56:07.152775 [INFO] agent: Requesting shutdown
TestPolicyDeleteCommand - 2019/12/30 18:56:07.152874 [INFO] consul: shutting down server
TestPolicyDeleteCommand - 2019/12/30 18:56:07.152925 [WARN] serf: Shutdown without a Leave
TestPolicyDeleteCommand - 2019/12/30 18:56:07.325995 [WARN] serf: Shutdown without a Leave
TestPolicyDeleteCommand - 2019/12/30 18:56:07.401088 [INFO] manager: shutting down
TestPolicyDeleteCommand - 2019/12/30 18:56:07.401710 [INFO] agent: consul server down
TestPolicyDeleteCommand - 2019/12/30 18:56:07.401782 [INFO] agent: shutdown complete
TestPolicyDeleteCommand - 2019/12/30 18:56:07.401849 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestPolicyDeleteCommand - 2019/12/30 18:56:07.402018 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestPolicyDeleteCommand - 2019/12/30 18:56:07.402202 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestPolicyDeleteCommand - 2019/12/30 18:56:07.402757 [INFO] agent: Waiting for endpoints to shut down
TestPolicyDeleteCommand - 2019/12/30 18:56:07.402848 [INFO] agent: Endpoints down
--- PASS: TestPolicyDeleteCommand (6.35s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/delete	6.849s
=== RUN   TestPolicyListCommand_noTabs
=== PAUSE TestPolicyListCommand_noTabs
=== RUN   TestPolicyListCommand
=== PAUSE TestPolicyListCommand
=== CONT  TestPolicyListCommand_noTabs
=== CONT  TestPolicyListCommand
--- PASS: TestPolicyListCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyListCommand - 2019/12/30 18:56:03.062358 [WARN] agent: Node name "Node f03dce68-46b7-836b-7989-c467be199afe" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyListCommand - 2019/12/30 18:56:03.063019 [DEBUG] tlsutil: Update with version 1
TestPolicyListCommand - 2019/12/30 18:56:03.070328 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f03dce68-46b7-836b-7989-c467be199afe Address:127.0.0.1:25006}]
2019/12/30 18:56:04 [INFO]  raft: Node at 127.0.0.1:25006 [Follower] entering Follower state (Leader: "")
TestPolicyListCommand - 2019/12/30 18:56:04.333470 [INFO] serf: EventMemberJoin: Node f03dce68-46b7-836b-7989-c467be199afe.dc1 127.0.0.1
TestPolicyListCommand - 2019/12/30 18:56:04.338657 [INFO] serf: EventMemberJoin: Node f03dce68-46b7-836b-7989-c467be199afe 127.0.0.1
TestPolicyListCommand - 2019/12/30 18:56:04.340608 [INFO] consul: Adding LAN server Node f03dce68-46b7-836b-7989-c467be199afe (Addr: tcp/127.0.0.1:25006) (DC: dc1)
TestPolicyListCommand - 2019/12/30 18:56:04.340812 [INFO] consul: Handled member-join event for server "Node f03dce68-46b7-836b-7989-c467be199afe.dc1" in area "wan"
TestPolicyListCommand - 2019/12/30 18:56:04.346724 [INFO] agent: Started DNS server 127.0.0.1:25001 (udp)
TestPolicyListCommand - 2019/12/30 18:56:04.346816 [INFO] agent: Started DNS server 127.0.0.1:25001 (tcp)
TestPolicyListCommand - 2019/12/30 18:56:04.349439 [INFO] agent: Started HTTP server on 127.0.0.1:25002 (tcp)
TestPolicyListCommand - 2019/12/30 18:56:04.349587 [INFO] agent: started state syncer
2019/12/30 18:56:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:04 [INFO]  raft: Node at 127.0.0.1:25006 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:05 [INFO]  raft: Node at 127.0.0.1:25006 [Leader] entering Leader state
TestPolicyListCommand - 2019/12/30 18:56:05.178672 [INFO] consul: cluster leadership acquired
TestPolicyListCommand - 2019/12/30 18:56:05.179318 [INFO] consul: New leader elected: Node f03dce68-46b7-836b-7989-c467be199afe
TestPolicyListCommand - 2019/12/30 18:56:05.363765 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyListCommand - 2019/12/30 18:56:05.507293 [INFO] acl: initializing acls
TestPolicyListCommand - 2019/12/30 18:56:05.751762 [INFO] consul: Created ACL 'global-management' policy
TestPolicyListCommand - 2019/12/30 18:56:05.751834 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyListCommand - 2019/12/30 18:56:05.897811 [INFO] acl: initializing acls
TestPolicyListCommand - 2019/12/30 18:56:05.898171 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyListCommand - 2019/12/30 18:56:06.171340 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyListCommand - 2019/12/30 18:56:06.171549 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyListCommand - 2019/12/30 18:56:06.369225 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyListCommand - 2019/12/30 18:56:06.370777 [INFO] serf: EventMemberUpdate: Node f03dce68-46b7-836b-7989-c467be199afe
TestPolicyListCommand - 2019/12/30 18:56:06.371617 [INFO] serf: EventMemberUpdate: Node f03dce68-46b7-836b-7989-c467be199afe.dc1
TestPolicyListCommand - 2019/12/30 18:56:06.549297 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyListCommand - 2019/12/30 18:56:06.549868 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyListCommand - 2019/12/30 18:56:06.551390 [INFO] serf: EventMemberUpdate: Node f03dce68-46b7-836b-7989-c467be199afe
TestPolicyListCommand - 2019/12/30 18:56:06.552211 [INFO] serf: EventMemberUpdate: Node f03dce68-46b7-836b-7989-c467be199afe.dc1
TestPolicyListCommand - 2019/12/30 18:56:09.067999 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyListCommand - 2019/12/30 18:56:09.070654 [INFO] agent: Synced node info
TestPolicyListCommand - 2019/12/30 18:56:09.070791 [DEBUG] agent: Node info in sync
TestPolicyListCommand - 2019/12/30 18:56:09.070891 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyListCommand - 2019/12/30 18:56:09.071395 [DEBUG] consul: Skipping self join check for "Node f03dce68-46b7-836b-7989-c467be199afe" since the cluster is too small
TestPolicyListCommand - 2019/12/30 18:56:09.071562 [INFO] consul: member 'Node f03dce68-46b7-836b-7989-c467be199afe' joined, marking health alive
TestPolicyListCommand - 2019/12/30 18:56:09.621205 [DEBUG] consul: Skipping self join check for "Node f03dce68-46b7-836b-7989-c467be199afe" since the cluster is too small
TestPolicyListCommand - 2019/12/30 18:56:09.621693 [DEBUG] consul: Skipping self join check for "Node f03dce68-46b7-836b-7989-c467be199afe" since the cluster is too small
TestPolicyListCommand - 2019/12/30 18:56:09.622726 [DEBUG] http: Request PUT /v1/acl/policy (525.221864ms) from=127.0.0.1:55762
TestPolicyListCommand - 2019/12/30 18:56:09.936733 [DEBUG] http: Request PUT /v1/acl/policy (309.548838ms) from=127.0.0.1:55762
TestPolicyListCommand - 2019/12/30 18:56:10.395331 [DEBUG] http: Request PUT /v1/acl/policy (455.666027ms) from=127.0.0.1:55762
TestPolicyListCommand - 2019/12/30 18:56:10.769724 [DEBUG] http: Request PUT /v1/acl/policy (362.91058ms) from=127.0.0.1:55762
TestPolicyListCommand - 2019/12/30 18:56:11.136706 [DEBUG] http: Request PUT /v1/acl/policy (362.82191ms) from=127.0.0.1:55762
TestPolicyListCommand - 2019/12/30 18:56:11.152988 [DEBUG] http: Request GET /v1/acl/policies (11.84798ms) from=127.0.0.1:55764
TestPolicyListCommand - 2019/12/30 18:56:11.159340 [INFO] agent: Requesting shutdown
TestPolicyListCommand - 2019/12/30 18:56:11.159516 [INFO] consul: shutting down server
TestPolicyListCommand - 2019/12/30 18:56:11.159580 [WARN] serf: Shutdown without a Leave
TestPolicyListCommand - 2019/12/30 18:56:11.251043 [WARN] serf: Shutdown without a Leave
TestPolicyListCommand - 2019/12/30 18:56:11.376169 [INFO] manager: shutting down
TestPolicyListCommand - 2019/12/30 18:56:11.377709 [INFO] agent: consul server down
TestPolicyListCommand - 2019/12/30 18:56:11.377829 [INFO] agent: shutdown complete
TestPolicyListCommand - 2019/12/30 18:56:11.378124 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (tcp)
TestPolicyListCommand - 2019/12/30 18:56:11.378475 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (udp)
TestPolicyListCommand - 2019/12/30 18:56:11.378829 [INFO] agent: Stopping HTTP server 127.0.0.1:25002 (tcp)
TestPolicyListCommand - 2019/12/30 18:56:11.380149 [INFO] agent: Waiting for endpoints to shut down
TestPolicyListCommand - 2019/12/30 18:56:11.380282 [INFO] agent: Endpoints down
--- PASS: TestPolicyListCommand (8.39s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/list	8.652s
=== RUN   TestPolicyReadCommand_noTabs
=== PAUSE TestPolicyReadCommand_noTabs
=== RUN   TestPolicyReadCommand
=== PAUSE TestPolicyReadCommand
=== CONT  TestPolicyReadCommand_noTabs
=== CONT  TestPolicyReadCommand
--- PASS: TestPolicyReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyReadCommand - 2019/12/30 18:56:28.235376 [WARN] agent: Node name "Node 8f3e2165-4371-6d72-a236-5f667e502689" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyReadCommand - 2019/12/30 18:56:28.236502 [DEBUG] tlsutil: Update with version 1
TestPolicyReadCommand - 2019/12/30 18:56:28.243895 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8f3e2165-4371-6d72-a236-5f667e502689 Address:127.0.0.1:11506}]
2019/12/30 18:56:29 [INFO]  raft: Node at 127.0.0.1:11506 [Follower] entering Follower state (Leader: "")
TestPolicyReadCommand - 2019/12/30 18:56:29.181585 [INFO] serf: EventMemberJoin: Node 8f3e2165-4371-6d72-a236-5f667e502689.dc1 127.0.0.1
TestPolicyReadCommand - 2019/12/30 18:56:29.203779 [INFO] serf: EventMemberJoin: Node 8f3e2165-4371-6d72-a236-5f667e502689 127.0.0.1
TestPolicyReadCommand - 2019/12/30 18:56:29.208973 [INFO] consul: Adding LAN server Node 8f3e2165-4371-6d72-a236-5f667e502689 (Addr: tcp/127.0.0.1:11506) (DC: dc1)
TestPolicyReadCommand - 2019/12/30 18:56:29.218884 [INFO] consul: Handled member-join event for server "Node 8f3e2165-4371-6d72-a236-5f667e502689.dc1" in area "wan"
2019/12/30 18:56:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:29 [INFO]  raft: Node at 127.0.0.1:11506 [Candidate] entering Candidate state in term 2
TestPolicyReadCommand - 2019/12/30 18:56:29.223159 [INFO] agent: Started DNS server 127.0.0.1:11501 (tcp)
TestPolicyReadCommand - 2019/12/30 18:56:29.223839 [INFO] agent: Started DNS server 127.0.0.1:11501 (udp)
TestPolicyReadCommand - 2019/12/30 18:56:29.226461 [INFO] agent: Started HTTP server on 127.0.0.1:11502 (tcp)
TestPolicyReadCommand - 2019/12/30 18:56:29.227081 [INFO] agent: started state syncer
2019/12/30 18:56:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:29 [INFO]  raft: Node at 127.0.0.1:11506 [Leader] entering Leader state
TestPolicyReadCommand - 2019/12/30 18:56:29.928493 [INFO] consul: cluster leadership acquired
TestPolicyReadCommand - 2019/12/30 18:56:29.929148 [INFO] consul: New leader elected: Node 8f3e2165-4371-6d72-a236-5f667e502689
TestPolicyReadCommand - 2019/12/30 18:56:29.972002 [INFO] acl: initializing acls
TestPolicyReadCommand - 2019/12/30 18:56:29.977598 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyReadCommand - 2019/12/30 18:56:30.277391 [INFO] acl: initializing acls
TestPolicyReadCommand - 2019/12/30 18:56:30.295252 [INFO] consul: Created ACL 'global-management' policy
TestPolicyReadCommand - 2019/12/30 18:56:30.295622 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyReadCommand - 2019/12/30 18:56:30.705781 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyReadCommand - 2019/12/30 18:56:30.708939 [INFO] consul: Created ACL 'global-management' policy
TestPolicyReadCommand - 2019/12/30 18:56:30.709032 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyReadCommand - 2019/12/30 18:56:30.870925 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyReadCommand - 2019/12/30 18:56:30.871044 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyReadCommand - 2019/12/30 18:56:30.871892 [INFO] serf: EventMemberUpdate: Node 8f3e2165-4371-6d72-a236-5f667e502689
TestPolicyReadCommand - 2019/12/30 18:56:30.872646 [INFO] serf: EventMemberUpdate: Node 8f3e2165-4371-6d72-a236-5f667e502689.dc1
TestPolicyReadCommand - 2019/12/30 18:56:31.161161 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyReadCommand - 2019/12/30 18:56:31.162033 [INFO] serf: EventMemberUpdate: Node 8f3e2165-4371-6d72-a236-5f667e502689
TestPolicyReadCommand - 2019/12/30 18:56:31.162656 [INFO] serf: EventMemberUpdate: Node 8f3e2165-4371-6d72-a236-5f667e502689.dc1
TestPolicyReadCommand - 2019/12/30 18:56:32.288591 [INFO] agent: Synced node info
TestPolicyReadCommand - 2019/12/30 18:56:32.288711 [DEBUG] agent: Node info in sync
TestPolicyReadCommand - 2019/12/30 18:56:32.687899 [DEBUG] http: Request PUT /v1/acl/policy (364.706623ms) from=127.0.0.1:56270
TestPolicyReadCommand - 2019/12/30 18:56:32.702895 [DEBUG] http: Request GET /v1/acl/policy/8e4c3b5a-f636-8f5f-03d8-5f41b23fd318 (6.95385ms) from=127.0.0.1:56272
TestPolicyReadCommand - 2019/12/30 18:56:32.707115 [INFO] agent: Requesting shutdown
TestPolicyReadCommand - 2019/12/30 18:56:32.707226 [INFO] consul: shutting down server
TestPolicyReadCommand - 2019/12/30 18:56:32.707476 [WARN] serf: Shutdown without a Leave
TestPolicyReadCommand - 2019/12/30 18:56:32.818638 [WARN] serf: Shutdown without a Leave
TestPolicyReadCommand - 2019/12/30 18:56:32.910796 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyReadCommand - 2019/12/30 18:56:32.911460 [DEBUG] consul: Skipping self join check for "Node 8f3e2165-4371-6d72-a236-5f667e502689" since the cluster is too small
TestPolicyReadCommand - 2019/12/30 18:56:32.911630 [INFO] consul: member 'Node 8f3e2165-4371-6d72-a236-5f667e502689' joined, marking health alive
TestPolicyReadCommand - 2019/12/30 18:56:32.914735 [INFO] manager: shutting down
TestPolicyReadCommand - 2019/12/30 18:56:33.110203 [INFO] agent: consul server down
TestPolicyReadCommand - 2019/12/30 18:56:33.110318 [INFO] agent: shutdown complete
TestPolicyReadCommand - 2019/12/30 18:56:33.110435 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (tcp)
TestPolicyReadCommand - 2019/12/30 18:56:33.110601 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (udp)
TestPolicyReadCommand - 2019/12/30 18:56:33.110787 [INFO] agent: Stopping HTTP server 127.0.0.1:11502 (tcp)
TestPolicyReadCommand - 2019/12/30 18:56:33.111552 [INFO] agent: Waiting for endpoints to shut down
TestPolicyReadCommand - 2019/12/30 18:56:33.111935 [ERR] consul: failed to reconcile member: {Node 8f3e2165-4371-6d72-a236-5f667e502689 127.0.0.1 11504 map[acls:1 bootstrap:1 build:1.5.2: dc:dc1 id:8f3e2165-4371-6d72-a236-5f667e502689 port:11506 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:11505] alive 1 5 2 2 5 4}: leadership lost while committing log
TestPolicyReadCommand - 2019/12/30 18:56:33.112291 [INFO] agent: Endpoints down
--- PASS: TestPolicyReadCommand (4.97s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/read	5.236s
=== RUN   TestPolicyUpdateCommand_noTabs
=== PAUSE TestPolicyUpdateCommand_noTabs
=== RUN   TestPolicyUpdateCommand
=== PAUSE TestPolicyUpdateCommand
=== CONT  TestPolicyUpdateCommand_noTabs
=== CONT  TestPolicyUpdateCommand
--- PASS: TestPolicyUpdateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyUpdateCommand - 2019/12/30 18:56:57.320688 [WARN] agent: Node name "Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyUpdateCommand - 2019/12/30 18:56:57.321791 [DEBUG] tlsutil: Update with version 1
TestPolicyUpdateCommand - 2019/12/30 18:56:57.344623 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:56:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:92b301f7-6f96-9fc0-a974-f4328ea92d0e Address:127.0.0.1:23506}]
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestPolicyUpdateCommand - 2019/12/30 18:56:58.545731 [INFO] serf: EventMemberJoin: Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e.dc1 127.0.0.1
TestPolicyUpdateCommand - 2019/12/30 18:56:58.550436 [INFO] serf: EventMemberJoin: Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e 127.0.0.1
TestPolicyUpdateCommand - 2019/12/30 18:56:58.551231 [INFO] consul: Adding LAN server Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestPolicyUpdateCommand - 2019/12/30 18:56:58.551915 [INFO] consul: Handled member-join event for server "Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e.dc1" in area "wan"
TestPolicyUpdateCommand - 2019/12/30 18:56:58.552408 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestPolicyUpdateCommand - 2019/12/30 18:56:58.552713 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestPolicyUpdateCommand - 2019/12/30 18:56:58.555143 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestPolicyUpdateCommand - 2019/12/30 18:56:58.555269 [INFO] agent: started state syncer
2019/12/30 18:56:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:56:58 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/30 18:56:59 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:56:59 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestPolicyUpdateCommand - 2019/12/30 18:56:59.278328 [INFO] consul: cluster leadership acquired
TestPolicyUpdateCommand - 2019/12/30 18:56:59.278849 [INFO] consul: New leader elected: Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e
TestPolicyUpdateCommand - 2019/12/30 18:56:59.303425 [INFO] acl: initializing acls
TestPolicyUpdateCommand - 2019/12/30 18:56:59.526312 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyUpdateCommand - 2019/12/30 18:56:59.619529 [INFO] acl: initializing acls
TestPolicyUpdateCommand - 2019/12/30 18:56:59.620109 [INFO] consul: Created ACL 'global-management' policy
TestPolicyUpdateCommand - 2019/12/30 18:56:59.620181 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyUpdateCommand - 2019/12/30 18:56:59.821343 [INFO] consul: Created ACL 'global-management' policy
TestPolicyUpdateCommand - 2019/12/30 18:56:59.821452 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyUpdateCommand - 2019/12/30 18:57:00.397127 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyUpdateCommand - 2019/12/30 18:57:00.400854 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyUpdateCommand - 2019/12/30 18:57:00.643644 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyUpdateCommand - 2019/12/30 18:57:00.787934 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyUpdateCommand - 2019/12/30 18:57:00.788201 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyUpdateCommand - 2019/12/30 18:57:00.788256 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyUpdateCommand - 2019/12/30 18:57:00.789171 [INFO] serf: EventMemberUpdate: Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e
TestPolicyUpdateCommand - 2019/12/30 18:57:00.789933 [INFO] serf: EventMemberUpdate: Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e.dc1
TestPolicyUpdateCommand - 2019/12/30 18:57:00.795213 [INFO] serf: EventMemberUpdate: Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e
TestPolicyUpdateCommand - 2019/12/30 18:57:00.796331 [INFO] serf: EventMemberUpdate: Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e.dc1
TestPolicyUpdateCommand - 2019/12/30 18:57:02.004466 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyUpdateCommand - 2019/12/30 18:57:02.005024 [DEBUG] consul: Skipping self join check for "Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e" since the cluster is too small
TestPolicyUpdateCommand - 2019/12/30 18:57:02.005147 [INFO] consul: member 'Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e' joined, marking health alive
TestPolicyUpdateCommand - 2019/12/30 18:57:02.212987 [DEBUG] consul: Skipping self join check for "Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e" since the cluster is too small
TestPolicyUpdateCommand - 2019/12/30 18:57:02.213705 [DEBUG] consul: Skipping self join check for "Node 92b301f7-6f96-9fc0-a974-f4328ea92d0e" since the cluster is too small
TestPolicyUpdateCommand - 2019/12/30 18:57:02.446057 [DEBUG] http: Request PUT /v1/acl/policy (208.751506ms) from=127.0.0.1:37680
TestPolicyUpdateCommand - 2019/12/30 18:57:02.453353 [DEBUG] http: Request GET /v1/acl/policy/b3e763d9-7d5c-6409-ffd2-192a9d5829c2 (1.675711ms) from=127.0.0.1:37682
TestPolicyUpdateCommand - 2019/12/30 18:57:02.782944 [DEBUG] http: Request PUT /v1/acl/policy/b3e763d9-7d5c-6409-ffd2-192a9d5829c2 (326.338609ms) from=127.0.0.1:37682
TestPolicyUpdateCommand - 2019/12/30 18:57:02.786279 [INFO] agent: Requesting shutdown
TestPolicyUpdateCommand - 2019/12/30 18:57:02.786376 [INFO] consul: shutting down server
TestPolicyUpdateCommand - 2019/12/30 18:57:02.786559 [WARN] serf: Shutdown without a Leave
TestPolicyUpdateCommand - 2019/12/30 18:57:02.867856 [WARN] serf: Shutdown without a Leave
TestPolicyUpdateCommand - 2019/12/30 18:57:02.969558 [INFO] manager: shutting down
TestPolicyUpdateCommand - 2019/12/30 18:57:02.970096 [INFO] agent: consul server down
TestPolicyUpdateCommand - 2019/12/30 18:57:02.970159 [INFO] agent: shutdown complete
TestPolicyUpdateCommand - 2019/12/30 18:57:02.970217 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestPolicyUpdateCommand - 2019/12/30 18:57:02.970575 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestPolicyUpdateCommand - 2019/12/30 18:57:02.970756 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestPolicyUpdateCommand - 2019/12/30 18:57:02.972246 [INFO] agent: Waiting for endpoints to shut down
TestPolicyUpdateCommand - 2019/12/30 18:57:02.972367 [INFO] agent: Endpoints down
--- PASS: TestPolicyUpdateCommand (5.75s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/update	6.045s
?   	github.com/hashicorp/consul/command/acl/role	[no test files]
=== RUN   TestRoleCreateCommand_noTabs
=== PAUSE TestRoleCreateCommand_noTabs
=== RUN   TestRoleCreateCommand
=== PAUSE TestRoleCreateCommand
=== CONT  TestRoleCreateCommand_noTabs
=== CONT  TestRoleCreateCommand
--- PASS: TestRoleCreateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleCreateCommand - 2019/12/30 18:56:59.920518 [WARN] agent: Node name "Node bde00aa7-2ff6-b133-1203-96408aa51334" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleCreateCommand - 2019/12/30 18:56:59.921611 [DEBUG] tlsutil: Update with version 1
TestRoleCreateCommand - 2019/12/30 18:56:59.940974 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bde00aa7-2ff6-b133-1203-96408aa51334 Address:127.0.0.1:29506}]
2019/12/30 18:57:01 [INFO]  raft: Node at 127.0.0.1:29506 [Follower] entering Follower state (Leader: "")
TestRoleCreateCommand - 2019/12/30 18:57:01.028015 [INFO] serf: EventMemberJoin: Node bde00aa7-2ff6-b133-1203-96408aa51334.dc1 127.0.0.1
TestRoleCreateCommand - 2019/12/30 18:57:01.032922 [INFO] serf: EventMemberJoin: Node bde00aa7-2ff6-b133-1203-96408aa51334 127.0.0.1
TestRoleCreateCommand - 2019/12/30 18:57:01.033903 [INFO] consul: Handled member-join event for server "Node bde00aa7-2ff6-b133-1203-96408aa51334.dc1" in area "wan"
TestRoleCreateCommand - 2019/12/30 18:57:01.034776 [INFO] agent: Started DNS server 127.0.0.1:29501 (tcp)
TestRoleCreateCommand - 2019/12/30 18:57:01.035161 [INFO] agent: Started DNS server 127.0.0.1:29501 (udp)
TestRoleCreateCommand - 2019/12/30 18:57:01.035131 [INFO] consul: Adding LAN server Node bde00aa7-2ff6-b133-1203-96408aa51334 (Addr: tcp/127.0.0.1:29506) (DC: dc1)
TestRoleCreateCommand - 2019/12/30 18:57:01.038140 [INFO] agent: Started HTTP server on 127.0.0.1:29502 (tcp)
TestRoleCreateCommand - 2019/12/30 18:57:01.038422 [INFO] agent: started state syncer
2019/12/30 18:57:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:01 [INFO]  raft: Node at 127.0.0.1:29506 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:01 [INFO]  raft: Node at 127.0.0.1:29506 [Leader] entering Leader state
TestRoleCreateCommand - 2019/12/30 18:57:01.714506 [INFO] consul: cluster leadership acquired
TestRoleCreateCommand - 2019/12/30 18:57:01.715092 [INFO] consul: New leader elected: Node bde00aa7-2ff6-b133-1203-96408aa51334
TestRoleCreateCommand - 2019/12/30 18:57:01.716891 [ERR] agent: failed to sync remote state: ACL not found
TestRoleCreateCommand - 2019/12/30 18:57:01.785088 [INFO] acl: initializing acls
TestRoleCreateCommand - 2019/12/30 18:57:02.138307 [INFO] consul: Created ACL 'global-management' policy
TestRoleCreateCommand - 2019/12/30 18:57:02.138395 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleCreateCommand - 2019/12/30 18:57:02.139536 [INFO] acl: initializing acls
TestRoleCreateCommand - 2019/12/30 18:57:02.139679 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleCreateCommand - 2019/12/30 18:57:02.370259 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleCreateCommand - 2019/12/30 18:57:02.714039 [ERR] agent: failed to sync remote state: ACL not found
TestRoleCreateCommand - 2019/12/30 18:57:02.953476 [INFO] consul: Created ACL anonymous token from configuration
TestRoleCreateCommand - 2019/12/30 18:57:02.953600 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleCreateCommand - 2019/12/30 18:57:02.953875 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleCreateCommand - 2019/12/30 18:57:02.959089 [INFO] serf: EventMemberUpdate: Node bde00aa7-2ff6-b133-1203-96408aa51334
TestRoleCreateCommand - 2019/12/30 18:57:02.959893 [INFO] serf: EventMemberUpdate: Node bde00aa7-2ff6-b133-1203-96408aa51334.dc1
TestRoleCreateCommand - 2019/12/30 18:57:02.962375 [INFO] serf: EventMemberUpdate: Node bde00aa7-2ff6-b133-1203-96408aa51334
TestRoleCreateCommand - 2019/12/30 18:57:02.963072 [INFO] serf: EventMemberUpdate: Node bde00aa7-2ff6-b133-1203-96408aa51334.dc1
TestRoleCreateCommand - 2019/12/30 18:57:04.361864 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleCreateCommand - 2019/12/30 18:57:04.362358 [DEBUG] consul: Skipping self join check for "Node bde00aa7-2ff6-b133-1203-96408aa51334" since the cluster is too small
TestRoleCreateCommand - 2019/12/30 18:57:04.362472 [INFO] consul: member 'Node bde00aa7-2ff6-b133-1203-96408aa51334' joined, marking health alive
TestRoleCreateCommand - 2019/12/30 18:57:04.527164 [DEBUG] consul: Skipping self join check for "Node bde00aa7-2ff6-b133-1203-96408aa51334" since the cluster is too small
TestRoleCreateCommand - 2019/12/30 18:57:04.527758 [DEBUG] consul: Skipping self join check for "Node bde00aa7-2ff6-b133-1203-96408aa51334" since the cluster is too small
TestRoleCreateCommand - 2019/12/30 18:57:04.842456 [DEBUG] http: Request PUT /v1/acl/policy (268.102251ms) from=127.0.0.1:44234
TestRoleCreateCommand - 2019/12/30 18:57:05.130452 [DEBUG] http: Request PUT /v1/acl/role (277.006492ms) from=127.0.0.1:44236
TestRoleCreateCommand - 2019/12/30 18:57:05.480030 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleCreateCommand - 2019/12/30 18:57:05.481844 [DEBUG] http: Request PUT /v1/acl/role (346.69071ms) from=127.0.0.1:44238
TestRoleCreateCommand - 2019/12/30 18:57:05.803994 [DEBUG] http: Request PUT /v1/acl/role (315.929211ms) from=127.0.0.1:44240
TestRoleCreateCommand - 2019/12/30 18:57:06.100338 [DEBUG] http: Request PUT /v1/acl/role (289.702502ms) from=127.0.0.1:44242
TestRoleCreateCommand - 2019/12/30 18:57:06.105178 [INFO] agent: Requesting shutdown
TestRoleCreateCommand - 2019/12/30 18:57:06.105275 [INFO] consul: shutting down server
TestRoleCreateCommand - 2019/12/30 18:57:06.105340 [WARN] serf: Shutdown without a Leave
TestRoleCreateCommand - 2019/12/30 18:57:06.177190 [WARN] serf: Shutdown without a Leave
TestRoleCreateCommand - 2019/12/30 18:57:06.269356 [INFO] manager: shutting down
TestRoleCreateCommand - 2019/12/30 18:57:06.270531 [INFO] agent: consul server down
TestRoleCreateCommand - 2019/12/30 18:57:06.270598 [INFO] agent: shutdown complete
TestRoleCreateCommand - 2019/12/30 18:57:06.270658 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (tcp)
TestRoleCreateCommand - 2019/12/30 18:57:06.270832 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (udp)
TestRoleCreateCommand - 2019/12/30 18:57:06.270990 [INFO] agent: Stopping HTTP server 127.0.0.1:29502 (tcp)
TestRoleCreateCommand - 2019/12/30 18:57:06.272313 [INFO] agent: Waiting for endpoints to shut down
TestRoleCreateCommand - 2019/12/30 18:57:06.274348 [INFO] agent: Endpoints down
--- PASS: TestRoleCreateCommand (6.45s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/create	6.691s
=== RUN   TestRoleDeleteCommand_noTabs
=== PAUSE TestRoleDeleteCommand_noTabs
=== RUN   TestRoleDeleteCommand
=== PAUSE TestRoleDeleteCommand
=== CONT  TestRoleDeleteCommand_noTabs
=== CONT  TestRoleDeleteCommand
--- PASS: TestRoleDeleteCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleDeleteCommand - 2019/12/30 18:57:18.858272 [WARN] agent: Node name "Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleDeleteCommand - 2019/12/30 18:57:18.859432 [DEBUG] tlsutil: Update with version 1
TestRoleDeleteCommand - 2019/12/30 18:57:18.867565 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:006efbb4-e27a-3a05-26b7-edff1d80c6c9 Address:127.0.0.1:40006}]
2019/12/30 18:57:19 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestRoleDeleteCommand - 2019/12/30 18:57:19.971210 [INFO] serf: EventMemberJoin: Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9.dc1 127.0.0.1
TestRoleDeleteCommand - 2019/12/30 18:57:19.990157 [INFO] serf: EventMemberJoin: Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9 127.0.0.1
TestRoleDeleteCommand - 2019/12/30 18:57:19.992363 [INFO] consul: Adding LAN server Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9 (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestRoleDeleteCommand - 2019/12/30 18:57:19.992856 [INFO] consul: Handled member-join event for server "Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9.dc1" in area "wan"
TestRoleDeleteCommand - 2019/12/30 18:57:19.995290 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestRoleDeleteCommand - 2019/12/30 18:57:19.997139 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestRoleDeleteCommand - 2019/12/30 18:57:20.000247 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestRoleDeleteCommand - 2019/12/30 18:57:20.000403 [INFO] agent: started state syncer
2019/12/30 18:57:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:20 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:20 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestRoleDeleteCommand - 2019/12/30 18:57:20.686846 [INFO] consul: cluster leadership acquired
TestRoleDeleteCommand - 2019/12/30 18:57:20.687378 [INFO] consul: New leader elected: Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9
TestRoleDeleteCommand - 2019/12/30 18:57:20.744540 [INFO] acl: initializing acls
TestRoleDeleteCommand - 2019/12/30 18:57:20.899013 [ERR] agent: failed to sync remote state: ACL not found
TestRoleDeleteCommand - 2019/12/30 18:57:21.080276 [INFO] consul: Created ACL 'global-management' policy
TestRoleDeleteCommand - 2019/12/30 18:57:21.080375 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleDeleteCommand - 2019/12/30 18:57:21.083044 [INFO] acl: initializing acls
TestRoleDeleteCommand - 2019/12/30 18:57:21.083185 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleDeleteCommand - 2019/12/30 18:57:21.629892 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleDeleteCommand - 2019/12/30 18:57:21.630048 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleDeleteCommand - 2019/12/30 18:57:22.055050 [INFO] consul: Created ACL anonymous token from configuration
TestRoleDeleteCommand - 2019/12/30 18:57:22.056042 [INFO] serf: EventMemberUpdate: Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9
TestRoleDeleteCommand - 2019/12/30 18:57:22.056712 [INFO] serf: EventMemberUpdate: Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9.dc1
TestRoleDeleteCommand - 2019/12/30 18:57:22.057919 [INFO] consul: Created ACL anonymous token from configuration
TestRoleDeleteCommand - 2019/12/30 18:57:22.057992 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleDeleteCommand - 2019/12/30 18:57:22.058916 [INFO] serf: EventMemberUpdate: Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9
TestRoleDeleteCommand - 2019/12/30 18:57:22.059757 [INFO] serf: EventMemberUpdate: Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9.dc1
TestRoleDeleteCommand - 2019/12/30 18:57:23.236874 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleDeleteCommand - 2019/12/30 18:57:23.237415 [DEBUG] consul: Skipping self join check for "Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9" since the cluster is too small
TestRoleDeleteCommand - 2019/12/30 18:57:23.237519 [INFO] consul: member 'Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9' joined, marking health alive
TestRoleDeleteCommand - 2019/12/30 18:57:23.527380 [DEBUG] consul: Skipping self join check for "Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9" since the cluster is too small
=== RUN   TestRoleDeleteCommand/id_or_name_required
=== RUN   TestRoleDeleteCommand/delete_works
TestRoleDeleteCommand - 2019/12/30 18:57:23.535810 [DEBUG] consul: Skipping self join check for "Node 006efbb4-e27a-3a05-26b7-edff1d80c6c9" since the cluster is too small
TestRoleDeleteCommand - 2019/12/30 18:57:23.789371 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestRoleDeleteCommand - 2019/12/30 18:57:24.146349 [INFO] agent: Synced node info
TestRoleDeleteCommand - 2019/12/30 18:57:24.146494 [DEBUG] agent: Node info in sync
TestRoleDeleteCommand - 2019/12/30 18:57:24.147680 [DEBUG] http: Request PUT /v1/acl/role (604.327994ms) from=127.0.0.1:46958
TestRoleDeleteCommand - 2019/12/30 18:57:24.447456 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleDeleteCommand - 2019/12/30 18:57:24.452004 [DEBUG] http: Request DELETE /v1/acl/role/3c04bafb-f590-31b6-35ab-88acb1df2dfb (294.539624ms) from=127.0.0.1:46960
TestRoleDeleteCommand - 2019/12/30 18:57:24.458877 [DEBUG] http: Request GET /v1/acl/role/3c04bafb-f590-31b6-35ab-88acb1df2dfb (513.68µs) from=127.0.0.1:46958
=== RUN   TestRoleDeleteCommand/delete_works_via_prefixes
TestRoleDeleteCommand - 2019/12/30 18:57:24.672581 [DEBUG] http: Request PUT /v1/acl/role (211.208372ms) from=127.0.0.1:46958
TestRoleDeleteCommand - 2019/12/30 18:57:24.684436 [DEBUG] http: Request GET /v1/acl/roles (1.71638ms) from=127.0.0.1:46962
TestRoleDeleteCommand - 2019/12/30 18:57:25.022649 [DEBUG] http: Request DELETE /v1/acl/role/ae42419d-3c6a-cdfa-2119-d6807d4c0435 (333.228335ms) from=127.0.0.1:46962
TestRoleDeleteCommand - 2019/12/30 18:57:25.025818 [DEBUG] http: Request GET /v1/acl/role/ae42419d-3c6a-cdfa-2119-d6807d4c0435 (476.346µs) from=127.0.0.1:46958
TestRoleDeleteCommand - 2019/12/30 18:57:25.026767 [INFO] agent: Requesting shutdown
TestRoleDeleteCommand - 2019/12/30 18:57:25.026841 [INFO] consul: shutting down server
TestRoleDeleteCommand - 2019/12/30 18:57:25.026887 [WARN] serf: Shutdown without a Leave
TestRoleDeleteCommand - 2019/12/30 18:57:25.138590 [WARN] serf: Shutdown without a Leave
TestRoleDeleteCommand - 2019/12/30 18:57:25.256160 [INFO] manager: shutting down
TestRoleDeleteCommand - 2019/12/30 18:57:25.256953 [INFO] agent: consul server down
TestRoleDeleteCommand - 2019/12/30 18:57:25.257019 [INFO] agent: shutdown complete
TestRoleDeleteCommand - 2019/12/30 18:57:25.257082 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestRoleDeleteCommand - 2019/12/30 18:57:25.257256 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestRoleDeleteCommand - 2019/12/30 18:57:25.257421 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestRoleDeleteCommand - 2019/12/30 18:57:25.257851 [INFO] agent: Waiting for endpoints to shut down
TestRoleDeleteCommand - 2019/12/30 18:57:25.258044 [INFO] agent: Endpoints down
--- PASS: TestRoleDeleteCommand (6.51s)
    --- PASS: TestRoleDeleteCommand/id_or_name_required (0.00s)
    --- PASS: TestRoleDeleteCommand/delete_works (0.93s)
    --- PASS: TestRoleDeleteCommand/delete_works_via_prefixes (0.57s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/delete	6.879s
=== RUN   TestRoleListCommand_noTabs
=== PAUSE TestRoleListCommand_noTabs
=== RUN   TestRoleListCommand
=== PAUSE TestRoleListCommand
=== CONT  TestRoleListCommand_noTabs
=== CONT  TestRoleListCommand
--- PASS: TestRoleListCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleListCommand - 2019/12/30 18:57:48.674676 [WARN] agent: Node name "Node ad1d85fc-cc3d-8519-8215-828b7855629d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleListCommand - 2019/12/30 18:57:48.675479 [DEBUG] tlsutil: Update with version 1
TestRoleListCommand - 2019/12/30 18:57:48.683614 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ad1d85fc-cc3d-8519-8215-828b7855629d Address:127.0.0.1:50506}]
2019/12/30 18:57:50 [INFO]  raft: Node at 127.0.0.1:50506 [Follower] entering Follower state (Leader: "")
TestRoleListCommand - 2019/12/30 18:57:50.118738 [INFO] serf: EventMemberJoin: Node ad1d85fc-cc3d-8519-8215-828b7855629d.dc1 127.0.0.1
TestRoleListCommand - 2019/12/30 18:57:50.128131 [INFO] serf: EventMemberJoin: Node ad1d85fc-cc3d-8519-8215-828b7855629d 127.0.0.1
TestRoleListCommand - 2019/12/30 18:57:50.131976 [INFO] agent: Started DNS server 127.0.0.1:50501 (udp)
TestRoleListCommand - 2019/12/30 18:57:50.135249 [INFO] consul: Adding LAN server Node ad1d85fc-cc3d-8519-8215-828b7855629d (Addr: tcp/127.0.0.1:50506) (DC: dc1)
TestRoleListCommand - 2019/12/30 18:57:50.135484 [INFO] consul: Handled member-join event for server "Node ad1d85fc-cc3d-8519-8215-828b7855629d.dc1" in area "wan"
TestRoleListCommand - 2019/12/30 18:57:50.138920 [INFO] agent: Started DNS server 127.0.0.1:50501 (tcp)
TestRoleListCommand - 2019/12/30 18:57:50.143060 [INFO] agent: Started HTTP server on 127.0.0.1:50502 (tcp)
TestRoleListCommand - 2019/12/30 18:57:50.143365 [INFO] agent: started state syncer
2019/12/30 18:57:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:50 [INFO]  raft: Node at 127.0.0.1:50506 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:50 [INFO]  raft: Node at 127.0.0.1:50506 [Leader] entering Leader state
TestRoleListCommand - 2019/12/30 18:57:50.953548 [INFO] consul: cluster leadership acquired
TestRoleListCommand - 2019/12/30 18:57:50.954181 [INFO] consul: New leader elected: Node ad1d85fc-cc3d-8519-8215-828b7855629d
TestRoleListCommand - 2019/12/30 18:57:50.968726 [ERR] agent: failed to sync remote state: ACL not found
TestRoleListCommand - 2019/12/30 18:57:51.388709 [INFO] acl: initializing acls
TestRoleListCommand - 2019/12/30 18:57:51.582124 [INFO] consul: Created ACL 'global-management' policy
TestRoleListCommand - 2019/12/30 18:57:51.582250 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleListCommand - 2019/12/30 18:57:51.686952 [INFO] acl: initializing acls
TestRoleListCommand - 2019/12/30 18:57:51.687135 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleListCommand - 2019/12/30 18:57:52.130373 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleListCommand - 2019/12/30 18:57:52.133068 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleListCommand - 2019/12/30 18:57:52.465728 [INFO] consul: Created ACL anonymous token from configuration
TestRoleListCommand - 2019/12/30 18:57:52.465895 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleListCommand - 2019/12/30 18:57:52.466804 [INFO] serf: EventMemberUpdate: Node ad1d85fc-cc3d-8519-8215-828b7855629d
TestRoleListCommand - 2019/12/30 18:57:52.470097 [INFO] serf: EventMemberUpdate: Node ad1d85fc-cc3d-8519-8215-828b7855629d.dc1
TestRoleListCommand - 2019/12/30 18:57:52.697931 [INFO] consul: Created ACL anonymous token from configuration
TestRoleListCommand - 2019/12/30 18:57:52.698962 [INFO] serf: EventMemberUpdate: Node ad1d85fc-cc3d-8519-8215-828b7855629d
TestRoleListCommand - 2019/12/30 18:57:52.699708 [INFO] serf: EventMemberUpdate: Node ad1d85fc-cc3d-8519-8215-828b7855629d.dc1
TestRoleListCommand - 2019/12/30 18:57:53.791922 [INFO] agent: Synced node info
TestRoleListCommand - 2019/12/30 18:57:53.792039 [DEBUG] agent: Node info in sync
TestRoleListCommand - 2019/12/30 18:57:54.381138 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleListCommand - 2019/12/30 18:57:54.381622 [DEBUG] consul: Skipping self join check for "Node ad1d85fc-cc3d-8519-8215-828b7855629d" since the cluster is too small
TestRoleListCommand - 2019/12/30 18:57:54.381780 [INFO] consul: member 'Node ad1d85fc-cc3d-8519-8215-828b7855629d' joined, marking health alive
TestRoleListCommand - 2019/12/30 18:57:54.386325 [DEBUG] http: Request PUT /v1/acl/role (576.420546ms) from=127.0.0.1:48586
TestRoleListCommand - 2019/12/30 18:57:54.853548 [DEBUG] consul: Skipping self join check for "Node ad1d85fc-cc3d-8519-8215-828b7855629d" since the cluster is too small
TestRoleListCommand - 2019/12/30 18:57:54.854359 [DEBUG] consul: Skipping self join check for "Node ad1d85fc-cc3d-8519-8215-828b7855629d" since the cluster is too small
TestRoleListCommand - 2019/12/30 18:57:54.861101 [DEBUG] http: Request PUT /v1/acl/role (469.93334ms) from=127.0.0.1:48586
TestRoleListCommand - 2019/12/30 18:57:55.154092 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleListCommand - 2019/12/30 18:57:55.157279 [DEBUG] http: Request PUT /v1/acl/role (292.158879ms) from=127.0.0.1:48586
TestRoleListCommand - 2019/12/30 18:57:55.366669 [DEBUG] http: Request PUT /v1/acl/role (204.700187ms) from=127.0.0.1:48586
TestRoleListCommand - 2019/12/30 18:57:55.724758 [DEBUG] http: Request PUT /v1/acl/role (347.381368ms) from=127.0.0.1:48586
TestRoleListCommand - 2019/12/30 18:57:55.731680 [DEBUG] http: Request GET /v1/acl/roles (2.819409ms) from=127.0.0.1:48588
TestRoleListCommand - 2019/12/30 18:57:55.735379 [INFO] agent: Requesting shutdown
TestRoleListCommand - 2019/12/30 18:57:55.735485 [INFO] consul: shutting down server
TestRoleListCommand - 2019/12/30 18:57:55.735542 [WARN] serf: Shutdown without a Leave
TestRoleListCommand - 2019/12/30 18:57:55.864101 [WARN] serf: Shutdown without a Leave
TestRoleListCommand - 2019/12/30 18:57:56.061771 [INFO] manager: shutting down
TestRoleListCommand - 2019/12/30 18:57:56.062638 [INFO] agent: consul server down
TestRoleListCommand - 2019/12/30 18:57:56.062712 [INFO] agent: shutdown complete
TestRoleListCommand - 2019/12/30 18:57:56.062799 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (tcp)
TestRoleListCommand - 2019/12/30 18:57:56.062962 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (udp)
TestRoleListCommand - 2019/12/30 18:57:56.063156 [INFO] agent: Stopping HTTP server 127.0.0.1:50502 (tcp)
TestRoleListCommand - 2019/12/30 18:57:56.063875 [INFO] agent: Waiting for endpoints to shut down
TestRoleListCommand - 2019/12/30 18:57:56.064011 [INFO] agent: Endpoints down
--- PASS: TestRoleListCommand (7.48s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/list	7.769s
=== RUN   TestRoleReadCommand_noTabs
=== PAUSE TestRoleReadCommand_noTabs
=== RUN   TestRoleReadCommand
=== PAUSE TestRoleReadCommand
=== CONT  TestRoleReadCommand_noTabs
=== CONT  TestRoleReadCommand
--- PASS: TestRoleReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleReadCommand - 2019/12/30 18:57:54.979694 [WARN] agent: Node name "Node 029263f1-36ef-d798-bf63-5a6660a018d1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleReadCommand - 2019/12/30 18:57:54.981245 [DEBUG] tlsutil: Update with version 1
TestRoleReadCommand - 2019/12/30 18:57:54.989855 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:57:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:029263f1-36ef-d798-bf63-5a6660a018d1 Address:127.0.0.1:46006}]
2019/12/30 18:57:56 [INFO]  raft: Node at 127.0.0.1:46006 [Follower] entering Follower state (Leader: "")
TestRoleReadCommand - 2019/12/30 18:57:56.394251 [INFO] serf: EventMemberJoin: Node 029263f1-36ef-d798-bf63-5a6660a018d1.dc1 127.0.0.1
TestRoleReadCommand - 2019/12/30 18:57:56.404586 [INFO] serf: EventMemberJoin: Node 029263f1-36ef-d798-bf63-5a6660a018d1 127.0.0.1
TestRoleReadCommand - 2019/12/30 18:57:56.406187 [INFO] consul: Adding LAN server Node 029263f1-36ef-d798-bf63-5a6660a018d1 (Addr: tcp/127.0.0.1:46006) (DC: dc1)
TestRoleReadCommand - 2019/12/30 18:57:56.406667 [INFO] consul: Handled member-join event for server "Node 029263f1-36ef-d798-bf63-5a6660a018d1.dc1" in area "wan"
TestRoleReadCommand - 2019/12/30 18:57:56.408496 [INFO] agent: Started DNS server 127.0.0.1:46001 (tcp)
TestRoleReadCommand - 2019/12/30 18:57:56.409030 [INFO] agent: Started DNS server 127.0.0.1:46001 (udp)
TestRoleReadCommand - 2019/12/30 18:57:56.412438 [INFO] agent: Started HTTP server on 127.0.0.1:46002 (tcp)
TestRoleReadCommand - 2019/12/30 18:57:56.412597 [INFO] agent: started state syncer
2019/12/30 18:57:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:57:56 [INFO]  raft: Node at 127.0.0.1:46006 [Candidate] entering Candidate state in term 2
2019/12/30 18:57:57 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:57:57 [INFO]  raft: Node at 127.0.0.1:46006 [Leader] entering Leader state
TestRoleReadCommand - 2019/12/30 18:57:57.266143 [INFO] consul: cluster leadership acquired
TestRoleReadCommand - 2019/12/30 18:57:57.266866 [INFO] consul: New leader elected: Node 029263f1-36ef-d798-bf63-5a6660a018d1
TestRoleReadCommand - 2019/12/30 18:57:57.556462 [ERR] agent: failed to sync remote state: ACL not found
TestRoleReadCommand - 2019/12/30 18:57:57.959136 [INFO] acl: initializing acls
TestRoleReadCommand - 2019/12/30 18:57:58.088955 [INFO] acl: initializing acls
TestRoleReadCommand - 2019/12/30 18:57:58.440724 [INFO] consul: Created ACL 'global-management' policy
TestRoleReadCommand - 2019/12/30 18:57:58.440874 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleReadCommand - 2019/12/30 18:57:58.819691 [ERR] agent: failed to sync remote state: ACL not found
TestRoleReadCommand - 2019/12/30 18:57:58.971616 [INFO] consul: Created ACL 'global-management' policy
TestRoleReadCommand - 2019/12/30 18:57:58.971719 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleReadCommand - 2019/12/30 18:57:59.679953 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleReadCommand - 2019/12/30 18:58:00.470979 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleReadCommand - 2019/12/30 18:58:00.471872 [INFO] consul: Created ACL anonymous token from configuration
TestRoleReadCommand - 2019/12/30 18:58:00.471984 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleReadCommand - 2019/12/30 18:58:00.473015 [INFO] serf: EventMemberUpdate: Node 029263f1-36ef-d798-bf63-5a6660a018d1
TestRoleReadCommand - 2019/12/30 18:58:00.474626 [INFO] serf: EventMemberUpdate: Node 029263f1-36ef-d798-bf63-5a6660a018d1.dc1
TestRoleReadCommand - 2019/12/30 18:58:00.713168 [INFO] consul: Created ACL anonymous token from configuration
TestRoleReadCommand - 2019/12/30 18:58:00.714226 [INFO] serf: EventMemberUpdate: Node 029263f1-36ef-d798-bf63-5a6660a018d1
TestRoleReadCommand - 2019/12/30 18:58:00.715262 [INFO] serf: EventMemberUpdate: Node 029263f1-36ef-d798-bf63-5a6660a018d1.dc1
TestRoleReadCommand - 2019/12/30 18:58:02.015507 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleReadCommand - 2019/12/30 18:58:02.016099 [DEBUG] consul: Skipping self join check for "Node 029263f1-36ef-d798-bf63-5a6660a018d1" since the cluster is too small
TestRoleReadCommand - 2019/12/30 18:58:02.016242 [INFO] consul: member 'Node 029263f1-36ef-d798-bf63-5a6660a018d1' joined, marking health alive
TestRoleReadCommand - 2019/12/30 18:58:02.290064 [DEBUG] consul: Skipping self join check for "Node 029263f1-36ef-d798-bf63-5a6660a018d1" since the cluster is too small
TestRoleReadCommand - 2019/12/30 18:58:02.290771 [DEBUG] consul: Skipping self join check for "Node 029263f1-36ef-d798-bf63-5a6660a018d1" since the cluster is too small
=== RUN   TestRoleReadCommand/id_or_name_required
=== RUN   TestRoleReadCommand/read_by_id_not_found
TestRoleReadCommand - 2019/12/30 18:58:02.326303 [DEBUG] http: Request GET /v1/acl/role/ede641f7-6805-81b3-d947-7c2cdda09a13 (5.458814ms) from=127.0.0.1:50846
=== RUN   TestRoleReadCommand/read_by_name_not_found
TestRoleReadCommand - 2019/12/30 18:58:02.332231 [DEBUG] http: Request GET /v1/acl/role/name/blah (664.684µs) from=127.0.0.1:50848
=== RUN   TestRoleReadCommand/read_by_id
TestRoleReadCommand - 2019/12/30 18:58:03.351691 [DEBUG] http: Request PUT /v1/acl/role (1.0065578s) from=127.0.0.1:50850
TestRoleReadCommand - 2019/12/30 18:58:03.352012 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleReadCommand - 2019/12/30 18:58:03.379974 [DEBUG] http: Request GET /v1/acl/role/33b295ea-9d7d-86a3-b558-e408c10fbcd6 (5.780156ms) from=127.0.0.1:50852
=== RUN   TestRoleReadCommand/read_by_id_prefix
TestRoleReadCommand - 2019/12/30 18:58:03.613917 [DEBUG] http: Request PUT /v1/acl/role (227.5338ms) from=127.0.0.1:50850
TestRoleReadCommand - 2019/12/30 18:58:03.625624 [DEBUG] http: Request GET /v1/acl/roles (1.985054ms) from=127.0.0.1:50854
TestRoleReadCommand - 2019/12/30 18:58:03.642504 [DEBUG] http: Request GET /v1/acl/role/9d1c1440-5f64-7d7f-19a5-f554b47104c9 (1.332703ms) from=127.0.0.1:50854
=== RUN   TestRoleReadCommand/read_by_name
TestRoleReadCommand - 2019/12/30 18:58:03.934271 [DEBUG] http: Request PUT /v1/acl/role (287.780757ms) from=127.0.0.1:50850
TestRoleReadCommand - 2019/12/30 18:58:03.944675 [DEBUG] http: Request GET /v1/acl/role/name/test-role-by-name (3.313089ms) from=127.0.0.1:50856
TestRoleReadCommand - 2019/12/30 18:58:03.949555 [INFO] agent: Requesting shutdown
TestRoleReadCommand - 2019/12/30 18:58:03.949670 [INFO] consul: shutting down server
TestRoleReadCommand - 2019/12/30 18:58:03.949790 [WARN] serf: Shutdown without a Leave
TestRoleReadCommand - 2019/12/30 18:58:04.095156 [WARN] serf: Shutdown without a Leave
TestRoleReadCommand - 2019/12/30 18:58:04.172515 [INFO] manager: shutting down
TestRoleReadCommand - 2019/12/30 18:58:04.173717 [INFO] agent: consul server down
TestRoleReadCommand - 2019/12/30 18:58:04.173795 [INFO] agent: shutdown complete
TestRoleReadCommand - 2019/12/30 18:58:04.173854 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (tcp)
TestRoleReadCommand - 2019/12/30 18:58:04.174029 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (udp)
TestRoleReadCommand - 2019/12/30 18:58:04.174212 [INFO] agent: Stopping HTTP server 127.0.0.1:46002 (tcp)
TestRoleReadCommand - 2019/12/30 18:58:04.176393 [INFO] agent: Waiting for endpoints to shut down
TestRoleReadCommand - 2019/12/30 18:58:04.178470 [INFO] agent: Endpoints down
--- PASS: TestRoleReadCommand (9.28s)
    --- PASS: TestRoleReadCommand/id_or_name_required (0.00s)
    --- PASS: TestRoleReadCommand/read_by_id_not_found (0.01s)
    --- PASS: TestRoleReadCommand/read_by_name_not_found (0.01s)
    --- PASS: TestRoleReadCommand/read_by_id (1.05s)
    --- PASS: TestRoleReadCommand/read_by_id_prefix (0.26s)
    --- PASS: TestRoleReadCommand/read_by_name (0.30s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/read	9.559s
=== RUN   TestRoleUpdateCommand_noTabs
=== PAUSE TestRoleUpdateCommand_noTabs
=== RUN   TestRoleUpdateCommand
=== PAUSE TestRoleUpdateCommand
=== RUN   TestRoleUpdateCommand_noMerge
=== PAUSE TestRoleUpdateCommand_noMerge
=== CONT  TestRoleUpdateCommand_noTabs
=== CONT  TestRoleUpdateCommand_noMerge
=== CONT  TestRoleUpdateCommand
--- PASS: TestRoleUpdateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleUpdateCommand - 2019/12/30 18:58:12.213000 [WARN] agent: Node name "Node 9e49505d-5adc-c19c-faf2-741e0885bb84" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleUpdateCommand - 2019/12/30 18:58:12.213996 [DEBUG] tlsutil: Update with version 1
TestRoleUpdateCommand - 2019/12/30 18:58:12.221028 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:12.222720 [WARN] agent: Node name "Node 6e885dc3-d1f9-1632-f9fd-03d05f881230" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:12.223299 [DEBUG] tlsutil: Update with version 1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:12.225835 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:58:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9e49505d-5adc-c19c-faf2-741e0885bb84 Address:127.0.0.1:35506}]
2019/12/30 18:58:13 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
2019/12/30 18:58:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6e885dc3-d1f9-1632-f9fd-03d05f881230 Address:127.0.0.1:35512}]
2019/12/30 18:58:13 [INFO]  raft: Node at 127.0.0.1:35512 [Follower] entering Follower state (Leader: "")
TestRoleUpdateCommand - 2019/12/30 18:58:13.194280 [INFO] serf: EventMemberJoin: Node 9e49505d-5adc-c19c-faf2-741e0885bb84.dc1 127.0.0.1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.195684 [INFO] serf: EventMemberJoin: Node 6e885dc3-d1f9-1632-f9fd-03d05f881230.dc1 127.0.0.1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.205458 [INFO] serf: EventMemberJoin: Node 6e885dc3-d1f9-1632-f9fd-03d05f881230 127.0.0.1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.212524 [INFO] consul: Adding LAN server Node 6e885dc3-d1f9-1632-f9fd-03d05f881230 (Addr: tcp/127.0.0.1:35512) (DC: dc1)
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.220914 [INFO] agent: Started DNS server 127.0.0.1:35507 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.221528 [INFO] consul: Handled member-join event for server "Node 6e885dc3-d1f9-1632-f9fd-03d05f881230.dc1" in area "wan"
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.221992 [INFO] agent: Started DNS server 127.0.0.1:35507 (udp)
TestRoleUpdateCommand - 2019/12/30 18:58:13.226285 [INFO] serf: EventMemberJoin: Node 9e49505d-5adc-c19c-faf2-741e0885bb84 127.0.0.1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.226970 [INFO] agent: Started HTTP server on 127.0.0.1:35508 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.227093 [INFO] agent: started state syncer
TestRoleUpdateCommand - 2019/12/30 18:58:13.228029 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
2019/12/30 18:58:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:13 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
TestRoleUpdateCommand - 2019/12/30 18:58:13.228253 [INFO] consul: Handled member-join event for server "Node 9e49505d-5adc-c19c-faf2-741e0885bb84.dc1" in area "wan"
TestRoleUpdateCommand - 2019/12/30 18:58:13.228809 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
TestRoleUpdateCommand - 2019/12/30 18:58:13.229759 [INFO] consul: Adding LAN server Node 9e49505d-5adc-c19c-faf2-741e0885bb84 (Addr: tcp/127.0.0.1:35506) (DC: dc1)
2019/12/30 18:58:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:13 [INFO]  raft: Node at 127.0.0.1:35512 [Candidate] entering Candidate state in term 2
TestRoleUpdateCommand - 2019/12/30 18:58:13.231345 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
TestRoleUpdateCommand - 2019/12/30 18:58:13.231442 [INFO] agent: started state syncer
2019/12/30 18:58:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:13 [INFO]  raft: Node at 127.0.0.1:35512 [Leader] entering Leader state
2019/12/30 18:58:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:13 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.704214 [INFO] consul: cluster leadership acquired
TestRoleUpdateCommand - 2019/12/30 18:58:13.704593 [INFO] consul: cluster leadership acquired
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.704794 [INFO] consul: New leader elected: Node 6e885dc3-d1f9-1632-f9fd-03d05f881230
TestRoleUpdateCommand - 2019/12/30 18:58:13.704967 [INFO] consul: New leader elected: Node 9e49505d-5adc-c19c-faf2-741e0885bb84
TestRoleUpdateCommand - 2019/12/30 18:58:13.738438 [ERR] agent: failed to sync remote state: ACL not found
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.882093 [ERR] agent: failed to sync remote state: ACL not found
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:13.971939 [INFO] acl: initializing acls
TestRoleUpdateCommand - 2019/12/30 18:58:13.979784 [INFO] acl: initializing acls
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:14.013492 [INFO] acl: initializing acls
TestRoleUpdateCommand - 2019/12/30 18:58:14.162808 [INFO] consul: Created ACL 'global-management' policy
TestRoleUpdateCommand - 2019/12/30 18:58:14.162914 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand - 2019/12/30 18:58:14.163016 [INFO] acl: initializing acls
TestRoleUpdateCommand - 2019/12/30 18:58:14.163108 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:14.162807 [INFO] consul: Created ACL 'global-management' policy
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:14.163592 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand - 2019/12/30 18:58:14.255397 [ERR] agent: failed to sync remote state: ACL not found
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:14.397940 [INFO] consul: Created ACL 'global-management' policy
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:14.398030 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand - 2019/12/30 18:58:14.909298 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:14.912425 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand - 2019/12/30 18:58:14.912870 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.074204 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand - 2019/12/30 18:58:15.330322 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand - 2019/12/30 18:58:15.330431 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleUpdateCommand - 2019/12/30 18:58:15.331055 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand - 2019/12/30 18:58:15.331207 [INFO] serf: EventMemberUpdate: Node 9e49505d-5adc-c19c-faf2-741e0885bb84
TestRoleUpdateCommand - 2019/12/30 18:58:15.331778 [INFO] serf: EventMemberUpdate: Node 9e49505d-5adc-c19c-faf2-741e0885bb84.dc1
TestRoleUpdateCommand - 2019/12/30 18:58:15.332088 [INFO] serf: EventMemberUpdate: Node 9e49505d-5adc-c19c-faf2-741e0885bb84
TestRoleUpdateCommand - 2019/12/30 18:58:15.332751 [INFO] serf: EventMemberUpdate: Node 9e49505d-5adc-c19c-faf2-741e0885bb84.dc1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.397280 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.397419 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.398471 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.398711 [INFO] serf: EventMemberUpdate: Node 6e885dc3-d1f9-1632-f9fd-03d05f881230
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.399778 [INFO] serf: EventMemberUpdate: Node 6e885dc3-d1f9-1632-f9fd-03d05f881230
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.400680 [INFO] serf: EventMemberUpdate: Node 6e885dc3-d1f9-1632-f9fd-03d05f881230.dc1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:15.401940 [INFO] serf: EventMemberUpdate: Node 6e885dc3-d1f9-1632-f9fd-03d05f881230.dc1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:17.413785 [INFO] agent: Synced node info
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:17.413909 [DEBUG] agent: Node info in sync
TestRoleUpdateCommand - 2019/12/30 18:58:17.756206 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleUpdateCommand - 2019/12/30 18:58:17.756634 [DEBUG] consul: Skipping self join check for "Node 9e49505d-5adc-c19c-faf2-741e0885bb84" since the cluster is too small
TestRoleUpdateCommand - 2019/12/30 18:58:17.756731 [INFO] consul: member 'Node 9e49505d-5adc-c19c-faf2-741e0885bb84' joined, marking health alive
TestRoleUpdateCommand - 2019/12/30 18:58:17.759509 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleUpdateCommand - 2019/12/30 18:58:18.057102 [DEBUG] consul: Skipping self join check for "Node 9e49505d-5adc-c19c-faf2-741e0885bb84" since the cluster is too small
TestRoleUpdateCommand - 2019/12/30 18:58:18.057669 [DEBUG] consul: Skipping self join check for "Node 9e49505d-5adc-c19c-faf2-741e0885bb84" since the cluster is too small
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:18.206841 [DEBUG] http: Request PUT /v1/acl/policy (769.468726ms) from=127.0.0.1:40272
TestRoleUpdateCommand - 2019/12/30 18:58:18.447848 [DEBUG] http: Request PUT /v1/acl/policy (332.374285ms) from=127.0.0.1:58120
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:18.522470 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleUpdateCommand - 2019/12/30 18:58:18.731230 [DEBUG] http: Request PUT /v1/acl/policy (279.55453ms) from=127.0.0.1:58120
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:18.803988 [DEBUG] consul: Skipping self join check for "Node 6e885dc3-d1f9-1632-f9fd-03d05f881230" since the cluster is too small
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:18.804232 [INFO] consul: member 'Node 6e885dc3-d1f9-1632-f9fd-03d05f881230' joined, marking health alive
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:18.810292 [DEBUG] http: Request PUT /v1/acl/policy (599.408811ms) from=127.0.0.1:40272
TestRoleUpdateCommand - 2019/12/30 18:58:19.056019 [DEBUG] http: Request PUT /v1/acl/role (321.383322ms) from=127.0.0.1:58120
=== RUN   TestRoleUpdateCommand/update_a_role_that_does_not_exist
TestRoleUpdateCommand - 2019/12/30 18:58:19.063127 [DEBUG] http: Request GET /v1/acl/role/1b3e71c5-4766-5c76-7ac6-dd0403c6d56b (657.685µs) from=127.0.0.1:58122
=== RUN   TestRoleUpdateCommand/update_with_policy_by_name
TestRoleUpdateCommand - 2019/12/30 18:58:19.074066 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (1.451372ms) from=127.0.0.1:58124
TestRoleUpdateCommand - 2019/12/30 18:58:19.397500 [DEBUG] http: Request PUT /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (320.816307ms) from=127.0.0.1:58124
TestRoleUpdateCommand - 2019/12/30 18:58:19.403435 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (796.688µs) from=127.0.0.1:58120
=== RUN   TestRoleUpdateCommand/update_with_policy_by_id
TestRoleUpdateCommand - 2019/12/30 18:58:19.415247 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (1.111363ms) from=127.0.0.1:58126
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:19.497601 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:19.499205 [DEBUG] consul: Skipping self join check for "Node 6e885dc3-d1f9-1632-f9fd-03d05f881230" since the cluster is too small
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:19.500097 [DEBUG] consul: Skipping self join check for "Node 6e885dc3-d1f9-1632-f9fd-03d05f881230" since the cluster is too small
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:19.500123 [DEBUG] http: Request PUT /v1/acl/policy (684.480768ms) from=127.0.0.1:40272
=== RUN   TestRoleUpdateCommand_noMerge/update_a_role_that_does_not_exist
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:19.508396 [DEBUG] http: Request GET /v1/acl/role/a0bbaced-d520-bb7f-238b-0685c52e6b7d (584.683µs) from=127.0.0.1:40282
=== RUN   TestRoleUpdateCommand_noMerge/update_with_policy_by_name
TestRoleUpdateCommand - 2019/12/30 18:58:19.739957 [DEBUG] http: Request PUT /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (321.986339ms) from=127.0.0.1:58126
TestRoleUpdateCommand - 2019/12/30 18:58:19.745259 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (1.794381ms) from=127.0.0.1:58120
=== RUN   TestRoleUpdateCommand/update_with_service_identity
TestRoleUpdateCommand - 2019/12/30 18:58:19.754563 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (1.334036ms) from=127.0.0.1:58130
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:19.830671 [DEBUG] http: Request PUT /v1/acl/role (320.05662ms) from=127.0.0.1:40272
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:19.838896 [DEBUG] http: Request GET /v1/acl/role/0cc79a18-be24-54db-40ec-d5de276903bb (1.321035ms) from=127.0.0.1:40286
TestRoleUpdateCommand - 2019/12/30 18:58:20.039497 [DEBUG] http: Request PUT /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (281.761588ms) from=127.0.0.1:58130
TestRoleUpdateCommand - 2019/12/30 18:58:20.043441 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (1.033027ms) from=127.0.0.1:58120
=== RUN   TestRoleUpdateCommand/update_with_service_identity_scoped_to_2_DCs
TestRoleUpdateCommand - 2019/12/30 18:58:20.052067 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (1.100029ms) from=127.0.0.1:58134
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.113854 [DEBUG] http: Request PUT /v1/acl/role/0cc79a18-be24-54db-40ec-d5de276903bb (270.519619ms) from=127.0.0.1:40286
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.118693 [DEBUG] http: Request GET /v1/acl/role/0cc79a18-be24-54db-40ec-d5de276903bb (973.36µs) from=127.0.0.1:40272
=== RUN   TestRoleUpdateCommand_noMerge/update_with_policy_by_id
TestRoleUpdateCommand - 2019/12/30 18:58:20.273013 [DEBUG] http: Request PUT /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (217.876201ms) from=127.0.0.1:58134
TestRoleUpdateCommand - 2019/12/30 18:58:20.277823 [DEBUG] http: Request GET /v1/acl/role/8ffc0124-74c2-bc05-4890-69947c8b2945 (1.014694ms) from=127.0.0.1:58120
TestRoleUpdateCommand - 2019/12/30 18:58:20.280380 [INFO] agent: Requesting shutdown
TestRoleUpdateCommand - 2019/12/30 18:58:20.280632 [INFO] consul: shutting down server
TestRoleUpdateCommand - 2019/12/30 18:58:20.280845 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.347091 [DEBUG] http: Request PUT /v1/acl/role (225.181398ms) from=127.0.0.1:40272
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.355361 [DEBUG] http: Request GET /v1/acl/role/a7bf20eb-d119-a2b5-d6e4-cd146e66edd4 (1.302701ms) from=127.0.0.1:40290
TestRoleUpdateCommand - 2019/12/30 18:58:20.428813 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand - 2019/12/30 18:58:20.545546 [INFO] manager: shutting down
TestRoleUpdateCommand - 2019/12/30 18:58:20.546313 [INFO] agent: consul server down
TestRoleUpdateCommand - 2019/12/30 18:58:20.546413 [INFO] agent: shutdown complete
TestRoleUpdateCommand - 2019/12/30 18:58:20.546509 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
TestRoleUpdateCommand - 2019/12/30 18:58:20.546703 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
TestRoleUpdateCommand - 2019/12/30 18:58:20.546865 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
TestRoleUpdateCommand - 2019/12/30 18:58:20.548129 [INFO] agent: Waiting for endpoints to shut down
TestRoleUpdateCommand - 2019/12/30 18:58:20.548297 [INFO] agent: Endpoints down
--- PASS: TestRoleUpdateCommand (8.48s)
    --- PASS: TestRoleUpdateCommand/update_a_role_that_does_not_exist (0.01s)
    --- PASS: TestRoleUpdateCommand/update_with_policy_by_name (0.34s)
    --- PASS: TestRoleUpdateCommand/update_with_policy_by_id (0.34s)
    --- PASS: TestRoleUpdateCommand/update_with_service_identity (0.30s)
    --- PASS: TestRoleUpdateCommand/update_with_service_identity_scoped_to_2_DCs (0.23s)
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.606457 [DEBUG] http: Request PUT /v1/acl/role/a7bf20eb-d119-a2b5-d6e4-cd146e66edd4 (248.314021ms) from=127.0.0.1:40290
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.610337 [DEBUG] http: Request GET /v1/acl/role/a7bf20eb-d119-a2b5-d6e4-cd146e66edd4 (1.102363ms) from=127.0.0.1:40272
=== RUN   TestRoleUpdateCommand_noMerge/update_with_service_identity
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.788564 [DEBUG] http: Request PUT /v1/acl/role (174.898377ms) from=127.0.0.1:40272
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.796356 [DEBUG] http: Request GET /v1/acl/role/55b42ed4-f7b8-a861-68f4-d58d9520ba23 (1.144364ms) from=127.0.0.1:40292
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.938793 [DEBUG] http: Request PUT /v1/acl/role/55b42ed4-f7b8-a861-68f4-d58d9520ba23 (139.711763ms) from=127.0.0.1:40292
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:20.942683 [DEBUG] http: Request GET /v1/acl/role/55b42ed4-f7b8-a861-68f4-d58d9520ba23 (972.026µs) from=127.0.0.1:40272
=== RUN   TestRoleUpdateCommand_noMerge/update_with_service_identity_scoped_to_2_DCs
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.122238 [DEBUG] http: Request PUT /v1/acl/role (175.991073ms) from=127.0.0.1:40272
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.130354 [DEBUG] http: Request GET /v1/acl/role/3c0a8cd3-c734-c46d-e4cc-aee7335f18b2 (1.402038ms) from=127.0.0.1:40294
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.272345 [DEBUG] http: Request PUT /v1/acl/role/3c0a8cd3-c734-c46d-e4cc-aee7335f18b2 (138.6064ms) from=127.0.0.1:40294
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.276630 [DEBUG] http: Request GET /v1/acl/role/3c0a8cd3-c734-c46d-e4cc-aee7335f18b2 (1.096697ms) from=127.0.0.1:40272
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.279349 [INFO] agent: Requesting shutdown
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.279497 [INFO] consul: shutting down server
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.279551 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.337177 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.412379 [INFO] manager: shutting down
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.413319 [INFO] agent: consul server down
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.413388 [INFO] agent: shutdown complete
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.413448 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.413620 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (udp)
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.413798 [INFO] agent: Stopping HTTP server 127.0.0.1:35508 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.415444 [INFO] agent: Waiting for endpoints to shut down
TestRoleUpdateCommand_noMerge - 2019/12/30 18:58:21.415729 [INFO] agent: Endpoints down
--- PASS: TestRoleUpdateCommand_noMerge (9.35s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_a_role_that_does_not_exist (0.01s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_policy_by_name (0.61s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_policy_by_id (0.49s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_service_identity (0.33s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_service_identity_scoped_to_2_DCs (0.33s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/update	9.625s
=== RUN   TestRulesTranslateCommand_noTabs
=== PAUSE TestRulesTranslateCommand_noTabs
=== RUN   TestRulesTranslateCommand
=== PAUSE TestRulesTranslateCommand
=== CONT  TestRulesTranslateCommand_noTabs
=== CONT  TestRulesTranslateCommand
--- PASS: TestRulesTranslateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRulesTranslateCommand - 2019/12/30 18:58:43.377631 [WARN] agent: Node name "Node 4279da65-a337-0246-2d89-8efe9937602c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRulesTranslateCommand - 2019/12/30 18:58:43.379172 [DEBUG] tlsutil: Update with version 1
TestRulesTranslateCommand - 2019/12/30 18:58:43.387223 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:58:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4279da65-a337-0246-2d89-8efe9937602c Address:127.0.0.1:34006}]
2019/12/30 18:58:44 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
TestRulesTranslateCommand - 2019/12/30 18:58:44.162445 [INFO] serf: EventMemberJoin: Node 4279da65-a337-0246-2d89-8efe9937602c.dc1 127.0.0.1
TestRulesTranslateCommand - 2019/12/30 18:58:44.170223 [INFO] serf: EventMemberJoin: Node 4279da65-a337-0246-2d89-8efe9937602c 127.0.0.1
TestRulesTranslateCommand - 2019/12/30 18:58:44.172196 [INFO] consul: Handled member-join event for server "Node 4279da65-a337-0246-2d89-8efe9937602c.dc1" in area "wan"
TestRulesTranslateCommand - 2019/12/30 18:58:44.172661 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestRulesTranslateCommand - 2019/12/30 18:58:44.173038 [INFO] consul: Adding LAN server Node 4279da65-a337-0246-2d89-8efe9937602c (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestRulesTranslateCommand - 2019/12/30 18:58:44.175395 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestRulesTranslateCommand - 2019/12/30 18:58:44.178831 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestRulesTranslateCommand - 2019/12/30 18:58:44.178961 [INFO] agent: started state syncer
2019/12/30 18:58:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:44 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
2019/12/30 18:58:44 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:44 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
TestRulesTranslateCommand - 2019/12/30 18:58:44.638357 [INFO] consul: cluster leadership acquired
TestRulesTranslateCommand - 2019/12/30 18:58:44.638901 [INFO] consul: New leader elected: Node 4279da65-a337-0246-2d89-8efe9937602c
TestRulesTranslateCommand - 2019/12/30 18:58:44.780488 [ERR] agent: failed to sync remote state: ACL not found
TestRulesTranslateCommand - 2019/12/30 18:58:44.923968 [INFO] acl: initializing acls
TestRulesTranslateCommand - 2019/12/30 18:58:44.946411 [INFO] acl: initializing acls
TestRulesTranslateCommand - 2019/12/30 18:58:45.088737 [INFO] consul: Created ACL 'global-management' policy
TestRulesTranslateCommand - 2019/12/30 18:58:45.088838 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRulesTranslateCommand - 2019/12/30 18:58:45.452738 [INFO] consul: Created ACL 'global-management' policy
TestRulesTranslateCommand - 2019/12/30 18:58:45.452820 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRulesTranslateCommand - 2019/12/30 18:58:45.458348 [INFO] consul: Bootstrapped ACL master token from configuration
TestRulesTranslateCommand - 2019/12/30 18:58:45.588117 [INFO] consul: Bootstrapped ACL master token from configuration
TestRulesTranslateCommand - 2019/12/30 18:58:45.747249 [INFO] consul: Created ACL anonymous token from configuration
TestRulesTranslateCommand - 2019/12/30 18:58:45.747383 [DEBUG] acl: transitioning out of legacy ACL mode
TestRulesTranslateCommand - 2019/12/30 18:58:45.748289 [INFO] serf: EventMemberUpdate: Node 4279da65-a337-0246-2d89-8efe9937602c
TestRulesTranslateCommand - 2019/12/30 18:58:45.748902 [INFO] serf: EventMemberUpdate: Node 4279da65-a337-0246-2d89-8efe9937602c.dc1
TestRulesTranslateCommand - 2019/12/30 18:58:45.905510 [INFO] consul: Created ACL anonymous token from configuration
TestRulesTranslateCommand - 2019/12/30 18:58:45.906407 [INFO] serf: EventMemberUpdate: Node 4279da65-a337-0246-2d89-8efe9937602c
TestRulesTranslateCommand - 2019/12/30 18:58:45.907062 [INFO] serf: EventMemberUpdate: Node 4279da65-a337-0246-2d89-8efe9937602c.dc1
TestRulesTranslateCommand - 2019/12/30 18:58:46.247010 [INFO] agent: Synced node info
TestRulesTranslateCommand - 2019/12/30 18:58:46.247126 [DEBUG] agent: Node info in sync
=== RUN   TestRulesTranslateCommand/file
=== RUN   TestRulesTranslateCommand/stdin
=== RUN   TestRulesTranslateCommand/arg
=== RUN   TestRulesTranslateCommand/exclusive-options
TestRulesTranslateCommand - 2019/12/30 18:58:46.283257 [INFO] agent: Requesting shutdown
TestRulesTranslateCommand - 2019/12/30 18:58:46.283334 [INFO] consul: shutting down server
TestRulesTranslateCommand - 2019/12/30 18:58:46.283376 [WARN] serf: Shutdown without a Leave
TestRulesTranslateCommand - 2019/12/30 18:58:46.421076 [WARN] serf: Shutdown without a Leave
TestRulesTranslateCommand - 2019/12/30 18:58:46.554563 [INFO] manager: shutting down
TestRulesTranslateCommand - 2019/12/30 18:58:46.662828 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestRulesTranslateCommand - 2019/12/30 18:58:46.663065 [INFO] agent: consul server down
TestRulesTranslateCommand - 2019/12/30 18:58:46.663112 [INFO] agent: shutdown complete
TestRulesTranslateCommand - 2019/12/30 18:58:46.663166 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestRulesTranslateCommand - 2019/12/30 18:58:46.663288 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestRulesTranslateCommand - 2019/12/30 18:58:46.663442 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestRulesTranslateCommand - 2019/12/30 18:58:46.663635 [INFO] agent: Waiting for endpoints to shut down
TestRulesTranslateCommand - 2019/12/30 18:58:46.663698 [INFO] agent: Endpoints down
--- PASS: TestRulesTranslateCommand (3.37s)
    --- PASS: TestRulesTranslateCommand/file (0.01s)
    --- PASS: TestRulesTranslateCommand/stdin (0.00s)
    --- PASS: TestRulesTranslateCommand/arg (0.00s)
    --- PASS: TestRulesTranslateCommand/exclusive-options (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/acl/rules	3.680s
?   	github.com/hashicorp/consul/command/acl/token	[no test files]
=== RUN   TestTokenCloneCommand_noTabs
=== PAUSE TestTokenCloneCommand_noTabs
=== RUN   TestTokenCloneCommand
=== PAUSE TestTokenCloneCommand
=== CONT  TestTokenCloneCommand_noTabs
--- PASS: TestTokenCloneCommand_noTabs (0.00s)
=== CONT  TestTokenCloneCommand
WARNING: bootstrap = true: do not enable unless necessary
TestTokenCloneCommand - 2019/12/30 18:58:51.251171 [WARN] agent: Node name "Node 55d53c96-17b0-7648-ef98-025fc35eeeb7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenCloneCommand - 2019/12/30 18:58:51.252438 [DEBUG] tlsutil: Update with version 1
TestTokenCloneCommand - 2019/12/30 18:58:51.266658 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:58:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:55d53c96-17b0-7648-ef98-025fc35eeeb7 Address:127.0.0.1:23506}]
2019/12/30 18:58:52 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestTokenCloneCommand - 2019/12/30 18:58:52.021185 [INFO] serf: EventMemberJoin: Node 55d53c96-17b0-7648-ef98-025fc35eeeb7.dc1 127.0.0.1
TestTokenCloneCommand - 2019/12/30 18:58:52.027064 [INFO] serf: EventMemberJoin: Node 55d53c96-17b0-7648-ef98-025fc35eeeb7 127.0.0.1
TestTokenCloneCommand - 2019/12/30 18:58:52.029274 [INFO] consul: Handled member-join event for server "Node 55d53c96-17b0-7648-ef98-025fc35eeeb7.dc1" in area "wan"
TestTokenCloneCommand - 2019/12/30 18:58:52.029290 [INFO] consul: Adding LAN server Node 55d53c96-17b0-7648-ef98-025fc35eeeb7 (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestTokenCloneCommand - 2019/12/30 18:58:52.030223 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestTokenCloneCommand - 2019/12/30 18:58:52.030492 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestTokenCloneCommand - 2019/12/30 18:58:52.033316 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestTokenCloneCommand - 2019/12/30 18:58:52.033569 [INFO] agent: started state syncer
2019/12/30 18:58:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:58:52 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/30 18:58:52 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:58:52 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestTokenCloneCommand - 2019/12/30 18:58:52.505146 [INFO] consul: cluster leadership acquired
TestTokenCloneCommand - 2019/12/30 18:58:52.505770 [INFO] consul: New leader elected: Node 55d53c96-17b0-7648-ef98-025fc35eeeb7
TestTokenCloneCommand - 2019/12/30 18:58:52.738118 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCloneCommand - 2019/12/30 18:58:52.781071 [INFO] acl: initializing acls
TestTokenCloneCommand - 2019/12/30 18:58:52.846539 [INFO] acl: initializing acls
TestTokenCloneCommand - 2019/12/30 18:58:52.997262 [INFO] consul: Created ACL 'global-management' policy
TestTokenCloneCommand - 2019/12/30 18:58:52.997371 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCloneCommand - 2019/12/30 18:58:53.330798 [INFO] consul: Created ACL 'global-management' policy
TestTokenCloneCommand - 2019/12/30 18:58:53.330896 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCloneCommand - 2019/12/30 18:58:53.332688 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCloneCommand - 2019/12/30 18:58:53.497085 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCloneCommand - 2019/12/30 18:58:53.497794 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCloneCommand - 2019/12/30 18:58:53.497893 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenCloneCommand - 2019/12/30 18:58:53.498745 [INFO] serf: EventMemberUpdate: Node 55d53c96-17b0-7648-ef98-025fc35eeeb7
TestTokenCloneCommand - 2019/12/30 18:58:53.499435 [INFO] serf: EventMemberUpdate: Node 55d53c96-17b0-7648-ef98-025fc35eeeb7.dc1
TestTokenCloneCommand - 2019/12/30 18:58:53.672340 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCloneCommand - 2019/12/30 18:58:53.673290 [INFO] serf: EventMemberUpdate: Node 55d53c96-17b0-7648-ef98-025fc35eeeb7
TestTokenCloneCommand - 2019/12/30 18:58:53.673998 [INFO] serf: EventMemberUpdate: Node 55d53c96-17b0-7648-ef98-025fc35eeeb7.dc1
TestTokenCloneCommand - 2019/12/30 18:58:54.581428 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenCloneCommand - 2019/12/30 18:58:54.582545 [DEBUG] consul: Skipping self join check for "Node 55d53c96-17b0-7648-ef98-025fc35eeeb7" since the cluster is too small
TestTokenCloneCommand - 2019/12/30 18:58:54.582753 [INFO] consul: member 'Node 55d53c96-17b0-7648-ef98-025fc35eeeb7' joined, marking health alive
TestTokenCloneCommand - 2019/12/30 18:58:54.732936 [DEBUG] consul: Skipping self join check for "Node 55d53c96-17b0-7648-ef98-025fc35eeeb7" since the cluster is too small
TestTokenCloneCommand - 2019/12/30 18:58:54.733583 [DEBUG] consul: Skipping self join check for "Node 55d53c96-17b0-7648-ef98-025fc35eeeb7" since the cluster is too small
TestTokenCloneCommand - 2019/12/30 18:58:54.957267 [DEBUG] http: Request PUT /v1/acl/policy (203.051459ms) from=127.0.0.1:37762
TestTokenCloneCommand - 2019/12/30 18:58:55.124637 [DEBUG] http: Request PUT /v1/acl/token (162.075358ms) from=127.0.0.1:37762
=== RUN   TestTokenCloneCommand/Description
TestTokenCloneCommand - 2019/12/30 18:58:55.273707 [DEBUG] http: Request PUT /v1/acl/token/0a2bebaf-dfe3-75ce-7b9a-3e551886943b/clone (140.030098ms) from=127.0.0.1:37764
TestTokenCloneCommand - 2019/12/30 18:58:55.281206 [DEBUG] http: Request GET /v1/acl/token/7efb2b6b-989f-ad6f-8394-63708133565a (1.517707ms) from=127.0.0.1:37762
=== RUN   TestTokenCloneCommand/Without_Description
TestTokenCloneCommand - 2019/12/30 18:58:55.440522 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestTokenCloneCommand - 2019/12/30 18:58:55.482045 [DEBUG] http: Request PUT /v1/acl/token/0a2bebaf-dfe3-75ce-7b9a-3e551886943b/clone (186.563015ms) from=127.0.0.1:37766
TestTokenCloneCommand - 2019/12/30 18:58:55.630484 [INFO] agent: Synced node info
TestTokenCloneCommand - 2019/12/30 18:58:55.630788 [DEBUG] agent: Node info in sync
TestTokenCloneCommand - 2019/12/30 18:58:55.631801 [DEBUG] http: Request GET /v1/acl/token/1bf8cccb-52dd-ddeb-84e6-2c34541311dd (144.858894ms) from=127.0.0.1:37762
TestTokenCloneCommand - 2019/12/30 18:58:55.634118 [INFO] agent: Requesting shutdown
TestTokenCloneCommand - 2019/12/30 18:58:55.634207 [INFO] consul: shutting down server
TestTokenCloneCommand - 2019/12/30 18:58:55.634254 [WARN] serf: Shutdown without a Leave
TestTokenCloneCommand - 2019/12/30 18:58:55.679650 [WARN] serf: Shutdown without a Leave
TestTokenCloneCommand - 2019/12/30 18:58:55.738024 [INFO] manager: shutting down
TestTokenCloneCommand - 2019/12/30 18:58:55.738599 [INFO] agent: consul server down
TestTokenCloneCommand - 2019/12/30 18:58:55.738655 [INFO] agent: shutdown complete
TestTokenCloneCommand - 2019/12/30 18:58:55.738709 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestTokenCloneCommand - 2019/12/30 18:58:55.738845 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestTokenCloneCommand - 2019/12/30 18:58:55.739007 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestTokenCloneCommand - 2019/12/30 18:58:55.739865 [INFO] agent: Waiting for endpoints to shut down
TestTokenCloneCommand - 2019/12/30 18:58:55.739982 [INFO] agent: Endpoints down
--- PASS: TestTokenCloneCommand (4.70s)
    --- PASS: TestTokenCloneCommand/Description (0.16s)
    --- PASS: TestTokenCloneCommand/Without_Description (0.35s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/clone	5.081s
=== RUN   TestTokenCreateCommand_noTabs
=== PAUSE TestTokenCreateCommand_noTabs
=== RUN   TestTokenCreateCommand
=== PAUSE TestTokenCreateCommand
=== CONT  TestTokenCreateCommand_noTabs
=== CONT  TestTokenCreateCommand
--- PASS: TestTokenCreateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenCreateCommand - 2019/12/30 18:59:10.055972 [WARN] agent: Node name "Node 6f528791-56ab-1374-db2e-cabd363a8fb9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenCreateCommand - 2019/12/30 18:59:10.056894 [DEBUG] tlsutil: Update with version 1
TestTokenCreateCommand - 2019/12/30 18:59:10.071641 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:59:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6f528791-56ab-1374-db2e-cabd363a8fb9 Address:127.0.0.1:19006}]
2019/12/30 18:59:10 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
TestTokenCreateCommand - 2019/12/30 18:59:10.827908 [INFO] serf: EventMemberJoin: Node 6f528791-56ab-1374-db2e-cabd363a8fb9.dc1 127.0.0.1
TestTokenCreateCommand - 2019/12/30 18:59:10.832311 [INFO] serf: EventMemberJoin: Node 6f528791-56ab-1374-db2e-cabd363a8fb9 127.0.0.1
TestTokenCreateCommand - 2019/12/30 18:59:10.840222 [INFO] consul: Adding LAN server Node 6f528791-56ab-1374-db2e-cabd363a8fb9 (Addr: tcp/127.0.0.1:19006) (DC: dc1)
TestTokenCreateCommand - 2019/12/30 18:59:10.840464 [INFO] consul: Handled member-join event for server "Node 6f528791-56ab-1374-db2e-cabd363a8fb9.dc1" in area "wan"
TestTokenCreateCommand - 2019/12/30 18:59:10.850572 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestTokenCreateCommand - 2019/12/30 18:59:10.850932 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestTokenCreateCommand - 2019/12/30 18:59:10.853681 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestTokenCreateCommand - 2019/12/30 18:59:10.853801 [INFO] agent: started state syncer
2019/12/30 18:59:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:59:10 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/12/30 18:59:11 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:59:11 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestTokenCreateCommand - 2019/12/30 18:59:11.305903 [INFO] consul: cluster leadership acquired
TestTokenCreateCommand - 2019/12/30 18:59:11.306527 [INFO] consul: New leader elected: Node 6f528791-56ab-1374-db2e-cabd363a8fb9
TestTokenCreateCommand - 2019/12/30 18:59:11.561694 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCreateCommand - 2019/12/30 18:59:11.570555 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCreateCommand - 2019/12/30 18:59:11.592391 [INFO] acl: initializing acls
TestTokenCreateCommand - 2019/12/30 18:59:11.755788 [INFO] consul: Created ACL 'global-management' policy
TestTokenCreateCommand - 2019/12/30 18:59:11.755879 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCreateCommand - 2019/12/30 18:59:11.758038 [INFO] acl: initializing acls
TestTokenCreateCommand - 2019/12/30 18:59:11.758172 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCreateCommand - 2019/12/30 18:59:11.914561 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCreateCommand - 2019/12/30 18:59:12.105434 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCreateCommand - 2019/12/30 18:59:12.406365 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCreateCommand - 2019/12/30 18:59:12.406688 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenCreateCommand - 2019/12/30 18:59:12.407071 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCreateCommand - 2019/12/30 18:59:12.407933 [INFO] serf: EventMemberUpdate: Node 6f528791-56ab-1374-db2e-cabd363a8fb9
TestTokenCreateCommand - 2019/12/30 18:59:12.408519 [INFO] serf: EventMemberUpdate: Node 6f528791-56ab-1374-db2e-cabd363a8fb9.dc1
TestTokenCreateCommand - 2019/12/30 18:59:12.408734 [INFO] serf: EventMemberUpdate: Node 6f528791-56ab-1374-db2e-cabd363a8fb9
TestTokenCreateCommand - 2019/12/30 18:59:12.410033 [INFO] serf: EventMemberUpdate: Node 6f528791-56ab-1374-db2e-cabd363a8fb9.dc1
TestTokenCreateCommand - 2019/12/30 18:59:13.240110 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenCreateCommand - 2019/12/30 18:59:13.241578 [DEBUG] consul: Skipping self join check for "Node 6f528791-56ab-1374-db2e-cabd363a8fb9" since the cluster is too small
TestTokenCreateCommand - 2019/12/30 18:59:13.241902 [INFO] consul: member 'Node 6f528791-56ab-1374-db2e-cabd363a8fb9' joined, marking health alive
TestTokenCreateCommand - 2019/12/30 18:59:13.436215 [DEBUG] consul: Skipping self join check for "Node 6f528791-56ab-1374-db2e-cabd363a8fb9" since the cluster is too small
TestTokenCreateCommand - 2019/12/30 18:59:13.437584 [DEBUG] consul: Skipping self join check for "Node 6f528791-56ab-1374-db2e-cabd363a8fb9" since the cluster is too small
TestTokenCreateCommand - 2019/12/30 18:59:13.641110 [DEBUG] http: Request PUT /v1/acl/policy (193.551199ms) from=127.0.0.1:57666
TestTokenCreateCommand - 2019/12/30 18:59:13.875469 [DEBUG] http: Request PUT /v1/acl/token (227.520778ms) from=127.0.0.1:57668
TestTokenCreateCommand - 2019/12/30 18:59:14.177844 [DEBUG] http: Request PUT /v1/acl/token (295.804278ms) from=127.0.0.1:57670
TestTokenCreateCommand - 2019/12/30 18:59:14.518432 [DEBUG] http: Request PUT /v1/acl/token (334.280311ms) from=127.0.0.1:57672
TestTokenCreateCommand - 2019/12/30 18:59:14.526301 [DEBUG] http: Request GET /v1/acl/token/3d852bb8-5153-4388-a3ca-8ca78661889f (2.325729ms) from=127.0.0.1:57674
TestTokenCreateCommand - 2019/12/30 18:59:14.529923 [INFO] agent: Requesting shutdown
TestTokenCreateCommand - 2019/12/30 18:59:14.530044 [INFO] consul: shutting down server
TestTokenCreateCommand - 2019/12/30 18:59:14.530097 [WARN] serf: Shutdown without a Leave
TestTokenCreateCommand - 2019/12/30 18:59:14.668255 [WARN] serf: Shutdown without a Leave
TestTokenCreateCommand - 2019/12/30 18:59:14.813676 [INFO] manager: shutting down
TestTokenCreateCommand - 2019/12/30 18:59:14.814221 [INFO] agent: consul server down
TestTokenCreateCommand - 2019/12/30 18:59:14.814291 [INFO] agent: shutdown complete
TestTokenCreateCommand - 2019/12/30 18:59:14.814346 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestTokenCreateCommand - 2019/12/30 18:59:14.814568 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestTokenCreateCommand - 2019/12/30 18:59:14.814747 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestTokenCreateCommand - 2019/12/30 18:59:14.816066 [INFO] agent: Waiting for endpoints to shut down
TestTokenCreateCommand - 2019/12/30 18:59:14.816389 [INFO] agent: Endpoints down
--- PASS: TestTokenCreateCommand (4.91s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/create	5.334s
=== RUN   TestTokenDeleteCommand_noTabs
=== PAUSE TestTokenDeleteCommand_noTabs
=== RUN   TestTokenDeleteCommand
=== PAUSE TestTokenDeleteCommand
=== CONT  TestTokenDeleteCommand_noTabs
=== CONT  TestTokenDeleteCommand
--- PASS: TestTokenDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenDeleteCommand - 2019/12/30 18:59:13.678757 [WARN] agent: Node name "Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenDeleteCommand - 2019/12/30 18:59:13.679678 [DEBUG] tlsutil: Update with version 1
TestTokenDeleteCommand - 2019/12/30 18:59:13.691278 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:59:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8c03c1e6-3e24-22ee-4325-acbf9f3f5f02 Address:127.0.0.1:50506}]
2019/12/30 18:59:14 [INFO]  raft: Node at 127.0.0.1:50506 [Follower] entering Follower state (Leader: "")
TestTokenDeleteCommand - 2019/12/30 18:59:14.969673 [INFO] serf: EventMemberJoin: Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02.dc1 127.0.0.1
TestTokenDeleteCommand - 2019/12/30 18:59:14.974348 [INFO] serf: EventMemberJoin: Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02 127.0.0.1
TestTokenDeleteCommand - 2019/12/30 18:59:14.976148 [INFO] consul: Adding LAN server Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02 (Addr: tcp/127.0.0.1:50506) (DC: dc1)
TestTokenDeleteCommand - 2019/12/30 18:59:14.976825 [INFO] consul: Handled member-join event for server "Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02.dc1" in area "wan"
TestTokenDeleteCommand - 2019/12/30 18:59:14.979182 [INFO] agent: Started DNS server 127.0.0.1:50501 (tcp)
TestTokenDeleteCommand - 2019/12/30 18:59:14.980521 [INFO] agent: Started DNS server 127.0.0.1:50501 (udp)
TestTokenDeleteCommand - 2019/12/30 18:59:14.983306 [INFO] agent: Started HTTP server on 127.0.0.1:50502 (tcp)
TestTokenDeleteCommand - 2019/12/30 18:59:14.983499 [INFO] agent: started state syncer
2019/12/30 18:59:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:59:15 [INFO]  raft: Node at 127.0.0.1:50506 [Candidate] entering Candidate state in term 2
2019/12/30 18:59:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:59:15 [INFO]  raft: Node at 127.0.0.1:50506 [Leader] entering Leader state
TestTokenDeleteCommand - 2019/12/30 18:59:15.505564 [INFO] consul: cluster leadership acquired
TestTokenDeleteCommand - 2019/12/30 18:59:15.506183 [INFO] consul: New leader elected: Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02
TestTokenDeleteCommand - 2019/12/30 18:59:15.687343 [ERR] agent: failed to sync remote state: ACL not found
TestTokenDeleteCommand - 2019/12/30 18:59:15.728690 [INFO] acl: initializing acls
TestTokenDeleteCommand - 2019/12/30 18:59:15.830433 [INFO] acl: initializing acls
TestTokenDeleteCommand - 2019/12/30 18:59:16.156045 [INFO] consul: Created ACL 'global-management' policy
TestTokenDeleteCommand - 2019/12/30 18:59:16.156153 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenDeleteCommand - 2019/12/30 18:59:16.156468 [INFO] consul: Created ACL 'global-management' policy
TestTokenDeleteCommand - 2019/12/30 18:59:16.156526 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenDeleteCommand - 2019/12/30 18:59:16.399284 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenDeleteCommand - 2019/12/30 18:59:16.571730 [ERR] agent: failed to sync remote state: ACL not found
TestTokenDeleteCommand - 2019/12/30 18:59:16.606960 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenDeleteCommand - 2019/12/30 18:59:16.923617 [INFO] consul: Created ACL anonymous token from configuration
TestTokenDeleteCommand - 2019/12/30 18:59:16.924799 [INFO] serf: EventMemberUpdate: Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02
TestTokenDeleteCommand - 2019/12/30 18:59:16.925484 [INFO] serf: EventMemberUpdate: Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02.dc1
TestTokenDeleteCommand - 2019/12/30 18:59:16.927143 [INFO] consul: Created ACL anonymous token from configuration
TestTokenDeleteCommand - 2019/12/30 18:59:16.927717 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenDeleteCommand - 2019/12/30 18:59:16.928548 [INFO] serf: EventMemberUpdate: Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02
TestTokenDeleteCommand - 2019/12/30 18:59:16.929175 [INFO] serf: EventMemberUpdate: Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02.dc1
TestTokenDeleteCommand - 2019/12/30 18:59:17.847584 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenDeleteCommand - 2019/12/30 18:59:17.848133 [DEBUG] consul: Skipping self join check for "Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02" since the cluster is too small
TestTokenDeleteCommand - 2019/12/30 18:59:17.848240 [INFO] consul: member 'Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02' joined, marking health alive
TestTokenDeleteCommand - 2019/12/30 18:59:18.032351 [DEBUG] consul: Skipping self join check for "Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02" since the cluster is too small
TestTokenDeleteCommand - 2019/12/30 18:59:18.032893 [DEBUG] consul: Skipping self join check for "Node 8c03c1e6-3e24-22ee-4325-acbf9f3f5f02" since the cluster is too small
TestTokenDeleteCommand - 2019/12/30 18:59:18.208193 [DEBUG] http: Request PUT /v1/acl/token (162.229023ms) from=127.0.0.1:48646
TestTokenDeleteCommand - 2019/12/30 18:59:18.399696 [DEBUG] http: Request DELETE /v1/acl/token/c4505c15-18d0-6a80-244b-932ab484d07f (185.555316ms) from=127.0.0.1:48648
TestTokenDeleteCommand - 2019/12/30 18:59:18.402590 [ERR] http: Request GET /v1/acl/token/c4505c15-18d0-6a80-244b-932ab484d07f, error: ACL not found from=127.0.0.1:48646
TestTokenDeleteCommand - 2019/12/30 18:59:18.403780 [DEBUG] http: Request GET /v1/acl/token/c4505c15-18d0-6a80-244b-932ab484d07f (1.642711ms) from=127.0.0.1:48646
TestTokenDeleteCommand - 2019/12/30 18:59:18.405514 [INFO] agent: Requesting shutdown
TestTokenDeleteCommand - 2019/12/30 18:59:18.405602 [INFO] consul: shutting down server
TestTokenDeleteCommand - 2019/12/30 18:59:18.405661 [WARN] serf: Shutdown without a Leave
TestTokenDeleteCommand - 2019/12/30 18:59:18.463421 [WARN] serf: Shutdown without a Leave
TestTokenDeleteCommand - 2019/12/30 18:59:18.523637 [INFO] manager: shutting down
TestTokenDeleteCommand - 2019/12/30 18:59:18.526505 [INFO] agent: consul server down
TestTokenDeleteCommand - 2019/12/30 18:59:18.526575 [INFO] agent: shutdown complete
TestTokenDeleteCommand - 2019/12/30 18:59:18.526630 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (tcp)
TestTokenDeleteCommand - 2019/12/30 18:59:18.526784 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (udp)
TestTokenDeleteCommand - 2019/12/30 18:59:18.526946 [INFO] agent: Stopping HTTP server 127.0.0.1:50502 (tcp)
TestTokenDeleteCommand - 2019/12/30 18:59:18.527372 [INFO] agent: Waiting for endpoints to shut down
TestTokenDeleteCommand - 2019/12/30 18:59:18.527450 [INFO] agent: Endpoints down
--- PASS: TestTokenDeleteCommand (4.94s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/delete	5.244s
=== RUN   TestTokenListCommand_noTabs
=== PAUSE TestTokenListCommand_noTabs
=== RUN   TestTokenListCommand
=== PAUSE TestTokenListCommand
=== CONT  TestTokenListCommand_noTabs
--- PASS: TestTokenListCommand_noTabs (0.00s)
=== CONT  TestTokenListCommand
WARNING: bootstrap = true: do not enable unless necessary
TestTokenListCommand - 2019/12/30 18:59:34.337443 [WARN] agent: Node name "Node 93b04f0c-5590-203d-73e7-4b01b805baca" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenListCommand - 2019/12/30 18:59:34.338554 [DEBUG] tlsutil: Update with version 1
TestTokenListCommand - 2019/12/30 18:59:34.345473 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:59:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:93b04f0c-5590-203d-73e7-4b01b805baca Address:127.0.0.1:46006}]
TestTokenListCommand - 2019/12/30 18:59:35.269237 [INFO] serf: EventMemberJoin: Node 93b04f0c-5590-203d-73e7-4b01b805baca.dc1 127.0.0.1
TestTokenListCommand - 2019/12/30 18:59:35.272681 [INFO] serf: EventMemberJoin: Node 93b04f0c-5590-203d-73e7-4b01b805baca 127.0.0.1
2019/12/30 18:59:35 [INFO]  raft: Node at 127.0.0.1:46006 [Follower] entering Follower state (Leader: "")
TestTokenListCommand - 2019/12/30 18:59:35.277273 [INFO] consul: Handled member-join event for server "Node 93b04f0c-5590-203d-73e7-4b01b805baca.dc1" in area "wan"
TestTokenListCommand - 2019/12/30 18:59:35.277817 [INFO] agent: Started DNS server 127.0.0.1:46001 (tcp)
TestTokenListCommand - 2019/12/30 18:59:35.278780 [INFO] consul: Adding LAN server Node 93b04f0c-5590-203d-73e7-4b01b805baca (Addr: tcp/127.0.0.1:46006) (DC: dc1)
TestTokenListCommand - 2019/12/30 18:59:35.280019 [INFO] agent: Started DNS server 127.0.0.1:46001 (udp)
TestTokenListCommand - 2019/12/30 18:59:35.285142 [INFO] agent: Started HTTP server on 127.0.0.1:46002 (tcp)
TestTokenListCommand - 2019/12/30 18:59:35.286130 [INFO] agent: started state syncer
2019/12/30 18:59:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:59:35 [INFO]  raft: Node at 127.0.0.1:46006 [Candidate] entering Candidate state in term 2
2019/12/30 18:59:35 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:59:35 [INFO]  raft: Node at 127.0.0.1:46006 [Leader] entering Leader state
TestTokenListCommand - 2019/12/30 18:59:35.739841 [INFO] consul: cluster leadership acquired
TestTokenListCommand - 2019/12/30 18:59:35.740391 [INFO] consul: New leader elected: Node 93b04f0c-5590-203d-73e7-4b01b805baca
TestTokenListCommand - 2019/12/30 18:59:35.769219 [ERR] agent: failed to sync remote state: ACL not found
TestTokenListCommand - 2019/12/30 18:59:36.028741 [INFO] acl: initializing acls
TestTokenListCommand - 2019/12/30 18:59:36.056529 [INFO] acl: initializing acls
TestTokenListCommand - 2019/12/30 18:59:36.267731 [INFO] consul: Created ACL 'global-management' policy
TestTokenListCommand - 2019/12/30 18:59:36.267810 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenListCommand - 2019/12/30 18:59:36.268056 [INFO] consul: Created ACL 'global-management' policy
TestTokenListCommand - 2019/12/30 18:59:36.268108 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenListCommand - 2019/12/30 18:59:36.490666 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenListCommand - 2019/12/30 18:59:36.797869 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenListCommand - 2019/12/30 18:59:36.798628 [INFO] consul: Created ACL anonymous token from configuration
TestTokenListCommand - 2019/12/30 18:59:36.798750 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenListCommand - 2019/12/30 18:59:36.799767 [INFO] serf: EventMemberUpdate: Node 93b04f0c-5590-203d-73e7-4b01b805baca
TestTokenListCommand - 2019/12/30 18:59:36.800545 [INFO] serf: EventMemberUpdate: Node 93b04f0c-5590-203d-73e7-4b01b805baca.dc1
TestTokenListCommand - 2019/12/30 18:59:36.957574 [INFO] consul: Created ACL anonymous token from configuration
TestTokenListCommand - 2019/12/30 18:59:36.959105 [INFO] serf: EventMemberUpdate: Node 93b04f0c-5590-203d-73e7-4b01b805baca
TestTokenListCommand - 2019/12/30 18:59:36.960464 [INFO] serf: EventMemberUpdate: Node 93b04f0c-5590-203d-73e7-4b01b805baca.dc1
TestTokenListCommand - 2019/12/30 18:59:37.964723 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenListCommand - 2019/12/30 18:59:37.965219 [INFO] agent: Synced node info
TestTokenListCommand - 2019/12/30 18:59:37.965331 [DEBUG] agent: Node info in sync
TestTokenListCommand - 2019/12/30 18:59:37.965235 [DEBUG] consul: Skipping self join check for "Node 93b04f0c-5590-203d-73e7-4b01b805baca" since the cluster is too small
TestTokenListCommand - 2019/12/30 18:59:37.965496 [INFO] consul: member 'Node 93b04f0c-5590-203d-73e7-4b01b805baca' joined, marking health alive
TestTokenListCommand - 2019/12/30 18:59:38.307857 [DEBUG] consul: Skipping self join check for "Node 93b04f0c-5590-203d-73e7-4b01b805baca" since the cluster is too small
TestTokenListCommand - 2019/12/30 18:59:38.308341 [DEBUG] consul: Skipping self join check for "Node 93b04f0c-5590-203d-73e7-4b01b805baca" since the cluster is too small
TestTokenListCommand - 2019/12/30 18:59:38.310248 [DEBUG] http: Request PUT /v1/acl/token (328.016801ms) from=127.0.0.1:50906
TestTokenListCommand - 2019/12/30 18:59:38.460764 [DEBUG] http: Request PUT /v1/acl/token (146.563599ms) from=127.0.0.1:50906
TestTokenListCommand - 2019/12/30 18:59:38.632691 [DEBUG] http: Request PUT /v1/acl/token (162.580362ms) from=127.0.0.1:50906
TestTokenListCommand - 2019/12/30 18:59:38.782882 [DEBUG] http: Request PUT /v1/acl/token (145.904248ms) from=127.0.0.1:50906
TestTokenListCommand - 2019/12/30 18:59:38.932719 [DEBUG] http: Request PUT /v1/acl/token (146.986944ms) from=127.0.0.1:50906
TestTokenListCommand - 2019/12/30 18:59:38.938078 [DEBUG] http: Request GET /v1/acl/tokens (1.963719ms) from=127.0.0.1:50908
TestTokenListCommand - 2019/12/30 18:59:38.942626 [INFO] agent: Requesting shutdown
TestTokenListCommand - 2019/12/30 18:59:38.942753 [INFO] consul: shutting down server
TestTokenListCommand - 2019/12/30 18:59:38.942803 [WARN] serf: Shutdown without a Leave
TestTokenListCommand - 2019/12/30 18:59:38.997209 [WARN] serf: Shutdown without a Leave
TestTokenListCommand - 2019/12/30 18:59:39.056151 [INFO] manager: shutting down
TestTokenListCommand - 2019/12/30 18:59:39.056567 [INFO] agent: consul server down
TestTokenListCommand - 2019/12/30 18:59:39.056625 [INFO] agent: shutdown complete
TestTokenListCommand - 2019/12/30 18:59:39.056681 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (tcp)
TestTokenListCommand - 2019/12/30 18:59:39.056814 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (udp)
TestTokenListCommand - 2019/12/30 18:59:39.056964 [INFO] agent: Stopping HTTP server 127.0.0.1:46002 (tcp)
TestTokenListCommand - 2019/12/30 18:59:39.060128 [INFO] agent: Waiting for endpoints to shut down
TestTokenListCommand - 2019/12/30 18:59:39.060300 [INFO] agent: Endpoints down
--- PASS: TestTokenListCommand (4.81s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/list	5.146s
=== RUN   TestTokenReadCommand_noTabs
=== PAUSE TestTokenReadCommand_noTabs
=== RUN   TestTokenReadCommand
=== PAUSE TestTokenReadCommand
=== CONT  TestTokenReadCommand_noTabs
=== CONT  TestTokenReadCommand
--- PASS: TestTokenReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenReadCommand - 2019/12/30 18:59:42.623876 [WARN] agent: Node name "Node 3e676c01-e918-cebd-8bcc-139fc59b3063" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenReadCommand - 2019/12/30 18:59:42.624933 [DEBUG] tlsutil: Update with version 1
TestTokenReadCommand - 2019/12/30 18:59:42.631405 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 18:59:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3e676c01-e918-cebd-8bcc-139fc59b3063 Address:127.0.0.1:14506}]
2019/12/30 18:59:43 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestTokenReadCommand - 2019/12/30 18:59:43.394581 [INFO] serf: EventMemberJoin: Node 3e676c01-e918-cebd-8bcc-139fc59b3063.dc1 127.0.0.1
TestTokenReadCommand - 2019/12/30 18:59:43.398171 [INFO] serf: EventMemberJoin: Node 3e676c01-e918-cebd-8bcc-139fc59b3063 127.0.0.1
TestTokenReadCommand - 2019/12/30 18:59:43.399707 [INFO] consul: Handled member-join event for server "Node 3e676c01-e918-cebd-8bcc-139fc59b3063.dc1" in area "wan"
TestTokenReadCommand - 2019/12/30 18:59:43.400128 [INFO] consul: Adding LAN server Node 3e676c01-e918-cebd-8bcc-139fc59b3063 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestTokenReadCommand - 2019/12/30 18:59:43.400712 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestTokenReadCommand - 2019/12/30 18:59:43.410446 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestTokenReadCommand - 2019/12/30 18:59:43.413370 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestTokenReadCommand - 2019/12/30 18:59:43.413525 [INFO] agent: started state syncer
2019/12/30 18:59:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 18:59:43 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/30 18:59:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 18:59:43 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestTokenReadCommand - 2019/12/30 18:59:43.931259 [INFO] consul: cluster leadership acquired
TestTokenReadCommand - 2019/12/30 18:59:43.931877 [INFO] consul: New leader elected: Node 3e676c01-e918-cebd-8bcc-139fc59b3063
TestTokenReadCommand - 2019/12/30 18:59:44.152094 [INFO] acl: initializing acls
TestTokenReadCommand - 2019/12/30 18:59:44.280694 [ERR] agent: failed to sync remote state: ACL not found
TestTokenReadCommand - 2019/12/30 18:59:44.431123 [INFO] acl: initializing acls
TestTokenReadCommand - 2019/12/30 18:59:44.431591 [INFO] consul: Created ACL 'global-management' policy
TestTokenReadCommand - 2019/12/30 18:59:44.431656 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenReadCommand - 2019/12/30 18:59:44.589921 [INFO] consul: Created ACL 'global-management' policy
TestTokenReadCommand - 2019/12/30 18:59:44.590017 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenReadCommand - 2019/12/30 18:59:45.192825 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenReadCommand - 2019/12/30 18:59:45.193314 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenReadCommand - 2019/12/30 18:59:45.508267 [INFO] consul: Created ACL anonymous token from configuration
TestTokenReadCommand - 2019/12/30 18:59:45.509326 [INFO] serf: EventMemberUpdate: Node 3e676c01-e918-cebd-8bcc-139fc59b3063
TestTokenReadCommand - 2019/12/30 18:59:45.510042 [INFO] serf: EventMemberUpdate: Node 3e676c01-e918-cebd-8bcc-139fc59b3063.dc1
TestTokenReadCommand - 2019/12/30 18:59:45.806902 [INFO] consul: Created ACL anonymous token from configuration
TestTokenReadCommand - 2019/12/30 18:59:45.807013 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenReadCommand - 2019/12/30 18:59:45.808007 [INFO] serf: EventMemberUpdate: Node 3e676c01-e918-cebd-8bcc-139fc59b3063
TestTokenReadCommand - 2019/12/30 18:59:45.808704 [INFO] serf: EventMemberUpdate: Node 3e676c01-e918-cebd-8bcc-139fc59b3063.dc1
TestTokenReadCommand - 2019/12/30 18:59:46.074746 [INFO] agent: Synced node info
TestTokenReadCommand - 2019/12/30 18:59:46.078663 [DEBUG] agent: Node info in sync
TestTokenReadCommand - 2019/12/30 18:59:46.604911 [DEBUG] http: Request PUT /v1/acl/token (503.108159ms) from=127.0.0.1:48504
TestTokenReadCommand - 2019/12/30 18:59:46.612949 [DEBUG] http: Request GET /v1/acl/token/ba2de9c0-985e-4906-c029-71086d06945c (1.840716ms) from=127.0.0.1:48506
TestTokenReadCommand - 2019/12/30 18:59:46.616050 [INFO] agent: Requesting shutdown
TestTokenReadCommand - 2019/12/30 18:59:46.616373 [INFO] consul: shutting down server
TestTokenReadCommand - 2019/12/30 18:59:46.616561 [WARN] serf: Shutdown without a Leave
TestTokenReadCommand - 2019/12/30 18:59:46.780909 [WARN] serf: Shutdown without a Leave
TestTokenReadCommand - 2019/12/30 18:59:46.855862 [INFO] manager: shutting down
TestTokenReadCommand - 2019/12/30 18:59:46.856568 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestTokenReadCommand - 2019/12/30 18:59:46.857014 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestTokenReadCommand - 2019/12/30 18:59:46.857716 [INFO] agent: consul server down
TestTokenReadCommand - 2019/12/30 18:59:46.858032 [INFO] agent: shutdown complete
TestTokenReadCommand - 2019/12/30 18:59:46.858536 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestTokenReadCommand - 2019/12/30 18:59:46.859062 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestTokenReadCommand - 2019/12/30 18:59:46.859681 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestTokenReadCommand - 2019/12/30 18:59:46.861045 [INFO] agent: Waiting for endpoints to shut down
TestTokenReadCommand - 2019/12/30 18:59:46.861540 [INFO] agent: Endpoints down
--- PASS: TestTokenReadCommand (4.40s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/read	4.921s
=== RUN   TestTokenUpdateCommand_noTabs
=== PAUSE TestTokenUpdateCommand_noTabs
=== RUN   TestTokenUpdateCommand
=== PAUSE TestTokenUpdateCommand
=== CONT  TestTokenUpdateCommand_noTabs
=== CONT  TestTokenUpdateCommand
--- PASS: TestTokenUpdateCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenUpdateCommand - 2019/12/30 19:00:02.722224 [WARN] agent: Node name "Node 81bbc640-0110-9c6a-24f1-af28535d44b1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenUpdateCommand - 2019/12/30 19:00:02.723375 [DEBUG] tlsutil: Update with version 1
TestTokenUpdateCommand - 2019/12/30 19:00:02.730891 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:00:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:81bbc640-0110-9c6a-24f1-af28535d44b1 Address:127.0.0.1:22006}]
2019/12/30 19:00:03 [INFO]  raft: Node at 127.0.0.1:22006 [Follower] entering Follower state (Leader: "")
TestTokenUpdateCommand - 2019/12/30 19:00:03.506191 [INFO] serf: EventMemberJoin: Node 81bbc640-0110-9c6a-24f1-af28535d44b1.dc1 127.0.0.1
TestTokenUpdateCommand - 2019/12/30 19:00:03.509489 [INFO] serf: EventMemberJoin: Node 81bbc640-0110-9c6a-24f1-af28535d44b1 127.0.0.1
TestTokenUpdateCommand - 2019/12/30 19:00:03.528432 [INFO] agent: Started DNS server 127.0.0.1:22001 (udp)
TestTokenUpdateCommand - 2019/12/30 19:00:03.530752 [INFO] consul: Handled member-join event for server "Node 81bbc640-0110-9c6a-24f1-af28535d44b1.dc1" in area "wan"
TestTokenUpdateCommand - 2019/12/30 19:00:03.545620 [INFO] agent: Started DNS server 127.0.0.1:22001 (tcp)
TestTokenUpdateCommand - 2019/12/30 19:00:03.549251 [INFO] consul: Adding LAN server Node 81bbc640-0110-9c6a-24f1-af28535d44b1 (Addr: tcp/127.0.0.1:22006) (DC: dc1)
TestTokenUpdateCommand - 2019/12/30 19:00:03.551781 [INFO] agent: Started HTTP server on 127.0.0.1:22002 (tcp)
TestTokenUpdateCommand - 2019/12/30 19:00:03.552155 [INFO] agent: started state syncer
2019/12/30 19:00:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:00:03 [INFO]  raft: Node at 127.0.0.1:22006 [Candidate] entering Candidate state in term 2
2019/12/30 19:00:03 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:00:03 [INFO]  raft: Node at 127.0.0.1:22006 [Leader] entering Leader state
TestTokenUpdateCommand - 2019/12/30 19:00:03.982008 [ERR] agent: failed to sync remote state: ACL not found
TestTokenUpdateCommand - 2019/12/30 19:00:03.982342 [INFO] consul: cluster leadership acquired
TestTokenUpdateCommand - 2019/12/30 19:00:03.982907 [INFO] consul: New leader elected: Node 81bbc640-0110-9c6a-24f1-af28535d44b1
TestTokenUpdateCommand - 2019/12/30 19:00:04.281804 [INFO] acl: initializing acls
TestTokenUpdateCommand - 2019/12/30 19:00:04.284613 [INFO] acl: initializing acls
TestTokenUpdateCommand - 2019/12/30 19:00:04.448841 [INFO] consul: Created ACL 'global-management' policy
TestTokenUpdateCommand - 2019/12/30 19:00:04.448939 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenUpdateCommand - 2019/12/30 19:00:04.615393 [INFO] consul: Created ACL 'global-management' policy
TestTokenUpdateCommand - 2019/12/30 19:00:04.615486 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenUpdateCommand - 2019/12/30 19:00:04.832853 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenUpdateCommand - 2019/12/30 19:00:05.124077 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenUpdateCommand - 2019/12/30 19:00:05.124924 [INFO] consul: Created ACL anonymous token from configuration
TestTokenUpdateCommand - 2019/12/30 19:00:05.125958 [INFO] serf: EventMemberUpdate: Node 81bbc640-0110-9c6a-24f1-af28535d44b1
TestTokenUpdateCommand - 2019/12/30 19:00:05.126653 [INFO] serf: EventMemberUpdate: Node 81bbc640-0110-9c6a-24f1-af28535d44b1.dc1
TestTokenUpdateCommand - 2019/12/30 19:00:05.291696 [INFO] consul: Created ACL anonymous token from configuration
TestTokenUpdateCommand - 2019/12/30 19:00:05.291781 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenUpdateCommand - 2019/12/30 19:00:05.292686 [INFO] serf: EventMemberUpdate: Node 81bbc640-0110-9c6a-24f1-af28535d44b1
TestTokenUpdateCommand - 2019/12/30 19:00:05.293390 [INFO] serf: EventMemberUpdate: Node 81bbc640-0110-9c6a-24f1-af28535d44b1.dc1
TestTokenUpdateCommand - 2019/12/30 19:00:06.782241 [INFO] agent: Synced node info
TestTokenUpdateCommand - 2019/12/30 19:00:06.782405 [DEBUG] agent: Node info in sync
TestTokenUpdateCommand - 2019/12/30 19:00:08.037409 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenUpdateCommand - 2019/12/30 19:00:08.040159 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenUpdateCommand - 2019/12/30 19:00:08.040677 [DEBUG] consul: Skipping self join check for "Node 81bbc640-0110-9c6a-24f1-af28535d44b1" since the cluster is too small
TestTokenUpdateCommand - 2019/12/30 19:00:08.040862 [INFO] consul: member 'Node 81bbc640-0110-9c6a-24f1-af28535d44b1' joined, marking health alive
TestTokenUpdateCommand - 2019/12/30 19:00:08.048419 [DEBUG] http: Request PUT /v1/acl/policy (1.232682696s) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:08.558647 [DEBUG] consul: Skipping self join check for "Node 81bbc640-0110-9c6a-24f1-af28535d44b1" since the cluster is too small
TestTokenUpdateCommand - 2019/12/30 19:00:08.559149 [DEBUG] consul: Skipping self join check for "Node 81bbc640-0110-9c6a-24f1-af28535d44b1" since the cluster is too small
TestTokenUpdateCommand - 2019/12/30 19:00:08.560745 [DEBUG] http: Request PUT /v1/acl/token (480.878551ms) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:08.741623 [DEBUG] http: Request PUT /v1/acl/create (176.54973ms) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:08.750784 [DEBUG] http: Request GET /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (1.552709ms) from=127.0.0.1:34168
TestTokenUpdateCommand - 2019/12/30 19:00:08.941860 [DEBUG] http: Request PUT /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (188.291711ms) from=127.0.0.1:34168
TestTokenUpdateCommand - 2019/12/30 19:00:08.946248 [DEBUG] http: Request GET /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (1.10503ms) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:08.972930 [DEBUG] http: Request GET /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (19.786863ms) from=127.0.0.1:34170
TestTokenUpdateCommand - 2019/12/30 19:00:09.166958 [DEBUG] http: Request PUT /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (184.532611ms) from=127.0.0.1:34170
TestTokenUpdateCommand - 2019/12/30 19:00:09.171173 [DEBUG] http: Request GET /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (1.091029ms) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.180311 [DEBUG] http: Request GET /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (2.161058ms) from=127.0.0.1:34172
TestTokenUpdateCommand - 2019/12/30 19:00:09.375167 [DEBUG] http: Request PUT /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (189.796419ms) from=127.0.0.1:34172
TestTokenUpdateCommand - 2019/12/30 19:00:09.379150 [DEBUG] http: Request GET /v1/acl/token/0022f326-01f0-d355-c37a-337251f0205e (973.692µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.383547 [DEBUG] http: Request GET /v1/acl/token/self (769.687µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.413043 [DEBUG] http: Request GET /v1/acl/token/self (857.023µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.442115 [DEBUG] http: Request GET /v1/acl/token/self (829.022µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.471357 [DEBUG] http: Request GET /v1/acl/token/self (731.686µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.500590 [DEBUG] http: Request GET /v1/acl/token/self (857.023µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.529956 [DEBUG] http: Request GET /v1/acl/token/self (887.358µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.558946 [DEBUG] http: Request GET /v1/acl/token/self (752.354µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.589888 [DEBUG] http: Request GET /v1/acl/token/self (760.687µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.618813 [DEBUG] http: Request GET /v1/acl/token/self (647.351µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.648123 [DEBUG] http: Request GET /v1/acl/token/self (886.357µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.677520 [DEBUG] http: Request GET /v1/acl/token/self (781.354µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.707049 [DEBUG] http: Request GET /v1/acl/token/self (762.687µs) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.736883 [DEBUG] http: Request GET /v1/acl/token/self (1.041361ms) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.750012 [DEBUG] http: Request GET /v1/acl/token/9303ee8c-5729-85ca-6302-60421d09c9cd (1.47804ms) from=127.0.0.1:34174
TestTokenUpdateCommand - 2019/12/30 19:00:09.916839 [DEBUG] http: Request PUT /v1/acl/token/9303ee8c-5729-85ca-6302-60421d09c9cd (163.738054ms) from=127.0.0.1:34174
TestTokenUpdateCommand - 2019/12/30 19:00:09.921531 [DEBUG] http: Request GET /v1/acl/token/9303ee8c-5729-85ca-6302-60421d09c9cd (1.067696ms) from=127.0.0.1:34162
TestTokenUpdateCommand - 2019/12/30 19:00:09.923831 [INFO] agent: Requesting shutdown
TestTokenUpdateCommand - 2019/12/30 19:00:09.923957 [INFO] consul: shutting down server
TestTokenUpdateCommand - 2019/12/30 19:00:09.924085 [WARN] serf: Shutdown without a Leave
TestTokenUpdateCommand - 2019/12/30 19:00:09.989696 [WARN] serf: Shutdown without a Leave
TestTokenUpdateCommand - 2019/12/30 19:00:10.056316 [INFO] manager: shutting down
TestTokenUpdateCommand - 2019/12/30 19:00:10.057170 [INFO] agent: consul server down
TestTokenUpdateCommand - 2019/12/30 19:00:10.057235 [INFO] agent: shutdown complete
TestTokenUpdateCommand - 2019/12/30 19:00:10.057296 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (tcp)
TestTokenUpdateCommand - 2019/12/30 19:00:10.057459 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (udp)
TestTokenUpdateCommand - 2019/12/30 19:00:10.057640 [INFO] agent: Stopping HTTP server 127.0.0.1:22002 (tcp)
TestTokenUpdateCommand - 2019/12/30 19:00:10.058804 [INFO] agent: Waiting for endpoints to shut down
TestTokenUpdateCommand - 2019/12/30 19:00:10.058904 [INFO] agent: Endpoints down
--- PASS: TestTokenUpdateCommand (7.45s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/update	7.700s
=== RUN   TestConfigFail
=== PAUSE TestConfigFail
=== RUN   TestRetryJoin
--- SKIP: TestRetryJoin (0.00s)
    agent_test.go:85: DM-skipped
=== RUN   TestRetryJoinFail
=== PAUSE TestRetryJoinFail
=== RUN   TestRetryJoinWanFail
=== PAUSE TestRetryJoinWanFail
=== RUN   TestProtectDataDir
=== PAUSE TestProtectDataDir
=== RUN   TestBadDataDirPermissions
=== PAUSE TestBadDataDirPermissions
=== CONT  TestConfigFail
=== CONT  TestRetryJoinWanFail
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=
=== CONT  TestBadDataDirPermissions
=== CONT  TestRetryJoinFail
--- PASS: TestBadDataDirPermissions (0.13s)
=== CONT  TestProtectDataDir
--- PASS: TestProtectDataDir (0.11s)
--- PASS: TestRetryJoinFail (0.51s)
--- PASS: TestRetryJoinWanFail (2.44s)
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=foo_some-other-arg
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise_0.0.0.0_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise_::_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise_[::]_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise-wan_0.0.0.0_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise-wan_::_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise-wan_[::]_-bind_10.0.0.1
--- PASS: TestConfigFail (9.24s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter= (2.73s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=foo_some-other-arg (0.85s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1 (0.89s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise_0.0.0.0_-bind_10.0.0.1 (0.95s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise_::_-bind_10.0.0.1 (0.96s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise_[::]_-bind_10.0.0.1 (0.84s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise-wan_0.0.0.0_-bind_10.0.0.1 (0.81s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise-wan_::_-bind_10.0.0.1 (0.61s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul254957165_-advertise-wan_[::]_-bind_10.0.0.1 (0.61s)
PASS
ok  	github.com/hashicorp/consul/command/agent	9.494s
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/catalog	0.114s
=== RUN   TestCatalogListDatacentersCommand_noTabs
=== PAUSE TestCatalogListDatacentersCommand_noTabs
=== RUN   TestCatalogListDatacentersCommand_Validation
=== PAUSE TestCatalogListDatacentersCommand_Validation
=== RUN   TestCatalogListDatacentersCommand
=== PAUSE TestCatalogListDatacentersCommand
=== CONT  TestCatalogListDatacentersCommand_noTabs
=== CONT  TestCatalogListDatacentersCommand
=== CONT  TestCatalogListDatacentersCommand_Validation
--- PASS: TestCatalogListDatacentersCommand_Validation (0.00s)
--- PASS: TestCatalogListDatacentersCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListDatacentersCommand - 2019/12/30 19:00:35.185618 [WARN] agent: Node name "Node 17981ad2-8788-838e-ba2a-ff87774fd177" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListDatacentersCommand - 2019/12/30 19:00:35.186517 [DEBUG] tlsutil: Update with version 1
TestCatalogListDatacentersCommand - 2019/12/30 19:00:35.196864 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:00:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:17981ad2-8788-838e-ba2a-ff87774fd177 Address:127.0.0.1:14506}]
2019/12/30 19:00:36 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.420236 [INFO] serf: EventMemberJoin: Node 17981ad2-8788-838e-ba2a-ff87774fd177.dc1 127.0.0.1
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.425725 [INFO] serf: EventMemberJoin: Node 17981ad2-8788-838e-ba2a-ff87774fd177 127.0.0.1
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.427137 [INFO] consul: Adding LAN server Node 17981ad2-8788-838e-ba2a-ff87774fd177 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.427642 [INFO] consul: Handled member-join event for server "Node 17981ad2-8788-838e-ba2a-ff87774fd177.dc1" in area "wan"
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.428650 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.431715 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.434842 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestCatalogListDatacentersCommand - 2019/12/30 19:00:36.435011 [INFO] agent: started state syncer
2019/12/30 19:00:36 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:00:36 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/30 19:00:37 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:00:37 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.059243 [INFO] consul: cluster leadership acquired
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.059985 [INFO] consul: New leader elected: Node 17981ad2-8788-838e-ba2a-ff87774fd177
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.093399 [DEBUG] http: Request GET /v1/catalog/datacenters (4.934466ms) from=127.0.0.1:48526
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.105492 [INFO] agent: Requesting shutdown
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.105588 [INFO] consul: shutting down server
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.105644 [WARN] serf: Shutdown without a Leave
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.105923 [ERR] agent: failed to sync remote state: No cluster leader
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.240212 [WARN] serf: Shutdown without a Leave
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.323655 [INFO] manager: shutting down
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.423615 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.423894 [INFO] agent: consul server down
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.423944 [INFO] agent: shutdown complete
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.423997 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.424126 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.424272 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.424840 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListDatacentersCommand - 2019/12/30 19:00:37.424929 [INFO] agent: Endpoints down
--- PASS: TestCatalogListDatacentersCommand (2.31s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/dc	2.792s
=== RUN   TestCatalogListNodesCommand_noTabs
=== PAUSE TestCatalogListNodesCommand_noTabs
=== RUN   TestCatalogListNodesCommand_Validation
=== PAUSE TestCatalogListNodesCommand_Validation
=== RUN   TestCatalogListNodesCommand
=== PAUSE TestCatalogListNodesCommand
=== CONT  TestCatalogListNodesCommand_noTabs
=== CONT  TestCatalogListNodesCommand_Validation
--- PASS: TestCatalogListNodesCommand_noTabs (0.00s)
--- PASS: TestCatalogListNodesCommand_Validation (0.00s)
=== CONT  TestCatalogListNodesCommand
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListNodesCommand - 2019/12/30 19:00:36.015452 [WARN] agent: Node name "Node eb979f57-7377-00fb-8558-b34f4a2a2e47" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListNodesCommand - 2019/12/30 19:00:36.017338 [DEBUG] tlsutil: Update with version 1
TestCatalogListNodesCommand - 2019/12/30 19:00:36.024315 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:00:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eb979f57-7377-00fb-8558-b34f4a2a2e47 Address:127.0.0.1:23506}]
2019/12/30 19:00:37 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestCatalogListNodesCommand - 2019/12/30 19:00:37.176158 [INFO] serf: EventMemberJoin: Node eb979f57-7377-00fb-8558-b34f4a2a2e47.dc1 127.0.0.1
TestCatalogListNodesCommand - 2019/12/30 19:00:37.184759 [INFO] serf: EventMemberJoin: Node eb979f57-7377-00fb-8558-b34f4a2a2e47 127.0.0.1
TestCatalogListNodesCommand - 2019/12/30 19:00:37.191783 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestCatalogListNodesCommand - 2019/12/30 19:00:37.192589 [INFO] consul: Adding LAN server Node eb979f57-7377-00fb-8558-b34f4a2a2e47 (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestCatalogListNodesCommand - 2019/12/30 19:00:37.194669 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestCatalogListNodesCommand - 2019/12/30 19:00:37.194682 [INFO] consul: Handled member-join event for server "Node eb979f57-7377-00fb-8558-b34f4a2a2e47.dc1" in area "wan"
TestCatalogListNodesCommand - 2019/12/30 19:00:37.197748 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestCatalogListNodesCommand - 2019/12/30 19:00:37.197905 [INFO] agent: started state syncer
2019/12/30 19:00:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:00:37 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/30 19:00:37 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:00:37 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestCatalogListNodesCommand - 2019/12/30 19:00:37.836810 [INFO] consul: cluster leadership acquired
TestCatalogListNodesCommand - 2019/12/30 19:00:37.837450 [INFO] consul: New leader elected: Node eb979f57-7377-00fb-8558-b34f4a2a2e47
TestCatalogListNodesCommand - 2019/12/30 19:00:38.174630 [INFO] agent: Synced node info
TestCatalogListNodesCommand - 2019/12/30 19:00:38.174780 [DEBUG] agent: Node info in sync
TestCatalogListNodesCommand - 2019/12/30 19:00:39.107608 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogListNodesCommand - 2019/12/30 19:00:39.108089 [DEBUG] consul: Skipping self join check for "Node eb979f57-7377-00fb-8558-b34f4a2a2e47" since the cluster is too small
TestCatalogListNodesCommand - 2019/12/30 19:00:39.108241 [INFO] consul: member 'Node eb979f57-7377-00fb-8558-b34f4a2a2e47' joined, marking health alive
=== RUN   TestCatalogListNodesCommand/simple
TestCatalogListNodesCommand - 2019/12/30 19:00:39.377777 [DEBUG] http: Request GET /v1/catalog/nodes (1.913718ms) from=127.0.0.1:37810
=== RUN   TestCatalogListNodesCommand/detailed
TestCatalogListNodesCommand - 2019/12/30 19:00:39.391341 [DEBUG] http: Request GET /v1/catalog/nodes (1.445705ms) from=127.0.0.1:37812
=== RUN   TestCatalogListNodesCommand/node-meta
TestCatalogListNodesCommand - 2019/12/30 19:00:39.402725 [DEBUG] http: Request GET /v1/catalog/nodes?node-meta=foo%3Abar (970.026µs) from=127.0.0.1:37814
=== RUN   TestCatalogListNodesCommand/filter
TestCatalogListNodesCommand - 2019/12/30 19:00:39.412819 [DEBUG] http: Request GET /v1/catalog/nodes?filter=Meta.foo+%3D%3D+bar (3.562429ms) from=127.0.0.1:37816
=== RUN   TestCatalogListNodesCommand/near
TestCatalogListNodesCommand - 2019/12/30 19:00:39.437607 [DEBUG] http: Request GET /v1/catalog/nodes?near=_agent (1.658378ms) from=127.0.0.1:37818
=== RUN   TestCatalogListNodesCommand/service_present
TestCatalogListNodesCommand - 2019/12/30 19:00:39.493426 [DEBUG] http: Request GET /v1/catalog/service/consul (42.534805ms) from=127.0.0.1:37820
=== RUN   TestCatalogListNodesCommand/service_missing
TestCatalogListNodesCommand - 2019/12/30 19:00:39.513148 [DEBUG] http: Request GET /v1/catalog/service/this-service-will-literally-never-exist (1.389037ms) from=127.0.0.1:37822
TestCatalogListNodesCommand - 2019/12/30 19:00:39.515065 [INFO] agent: Requesting shutdown
TestCatalogListNodesCommand - 2019/12/30 19:00:39.515159 [INFO] consul: shutting down server
TestCatalogListNodesCommand - 2019/12/30 19:00:39.515205 [WARN] serf: Shutdown without a Leave
TestCatalogListNodesCommand - 2019/12/30 19:00:39.664626 [WARN] serf: Shutdown without a Leave
TestCatalogListNodesCommand - 2019/12/30 19:00:39.748735 [INFO] manager: shutting down
TestCatalogListNodesCommand - 2019/12/30 19:00:39.749340 [INFO] agent: consul server down
TestCatalogListNodesCommand - 2019/12/30 19:00:39.749445 [INFO] agent: shutdown complete
TestCatalogListNodesCommand - 2019/12/30 19:00:39.749505 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestCatalogListNodesCommand - 2019/12/30 19:00:39.749650 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestCatalogListNodesCommand - 2019/12/30 19:00:39.749796 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestCatalogListNodesCommand - 2019/12/30 19:00:39.751167 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListNodesCommand - 2019/12/30 19:00:39.751253 [INFO] agent: Endpoints down
--- PASS: TestCatalogListNodesCommand (3.82s)
    --- PASS: TestCatalogListNodesCommand/simple (0.02s)
    --- PASS: TestCatalogListNodesCommand/detailed (0.01s)
    --- PASS: TestCatalogListNodesCommand/node-meta (0.01s)
    --- PASS: TestCatalogListNodesCommand/filter (0.02s)
    --- PASS: TestCatalogListNodesCommand/near (0.02s)
    --- PASS: TestCatalogListNodesCommand/service_present (0.06s)
    --- PASS: TestCatalogListNodesCommand/service_missing (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/nodes	4.144s
=== RUN   TestCatalogListServicesCommand_noTabs
=== PAUSE TestCatalogListServicesCommand_noTabs
=== RUN   TestCatalogListServicesCommand_Validation
=== PAUSE TestCatalogListServicesCommand_Validation
=== RUN   TestCatalogListServicesCommand
=== PAUSE TestCatalogListServicesCommand
=== CONT  TestCatalogListServicesCommand_noTabs
=== CONT  TestCatalogListServicesCommand
=== CONT  TestCatalogListServicesCommand_Validation
--- PASS: TestCatalogListServicesCommand_Validation (0.01s)
--- PASS: TestCatalogListServicesCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListServicesCommand - 2019/12/30 19:01:04.835623 [WARN] agent: Node name "Node e371371c-08b6-50c8-42be-1d60f9fb9f81" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListServicesCommand - 2019/12/30 19:01:04.836810 [DEBUG] tlsutil: Update with version 1
TestCatalogListServicesCommand - 2019/12/30 19:01:04.844283 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:01:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e371371c-08b6-50c8-42be-1d60f9fb9f81 Address:127.0.0.1:41506}]
2019/12/30 19:01:06 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
TestCatalogListServicesCommand - 2019/12/30 19:01:06.131633 [INFO] serf: EventMemberJoin: Node e371371c-08b6-50c8-42be-1d60f9fb9f81.dc1 127.0.0.1
TestCatalogListServicesCommand - 2019/12/30 19:01:06.137422 [INFO] serf: EventMemberJoin: Node e371371c-08b6-50c8-42be-1d60f9fb9f81 127.0.0.1
TestCatalogListServicesCommand - 2019/12/30 19:01:06.140643 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestCatalogListServicesCommand - 2019/12/30 19:01:06.142767 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestCatalogListServicesCommand - 2019/12/30 19:01:06.145159 [INFO] consul: Handled member-join event for server "Node e371371c-08b6-50c8-42be-1d60f9fb9f81.dc1" in area "wan"
TestCatalogListServicesCommand - 2019/12/30 19:01:06.147808 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestCatalogListServicesCommand - 2019/12/30 19:01:06.148296 [INFO] agent: started state syncer
TestCatalogListServicesCommand - 2019/12/30 19:01:06.150469 [INFO] consul: Adding LAN server Node e371371c-08b6-50c8-42be-1d60f9fb9f81 (Addr: tcp/127.0.0.1:41506) (DC: dc1)
2019/12/30 19:01:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:01:06 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
2019/12/30 19:01:06 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:01:06 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
TestCatalogListServicesCommand - 2019/12/30 19:01:06.999862 [INFO] consul: cluster leadership acquired
TestCatalogListServicesCommand - 2019/12/30 19:01:07.000485 [INFO] consul: New leader elected: Node e371371c-08b6-50c8-42be-1d60f9fb9f81
TestCatalogListServicesCommand - 2019/12/30 19:01:07.567059 [INFO] agent: Synced node info
TestCatalogListServicesCommand - 2019/12/30 19:01:08.025995 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/12/30 19:01:08.026123 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/12/30 19:01:08.791647 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogListServicesCommand - 2019/12/30 19:01:08.792096 [DEBUG] consul: Skipping self join check for "Node e371371c-08b6-50c8-42be-1d60f9fb9f81" since the cluster is too small
TestCatalogListServicesCommand - 2019/12/30 19:01:08.792253 [INFO] consul: member 'Node e371371c-08b6-50c8-42be-1d60f9fb9f81' joined, marking health alive
TestCatalogListServicesCommand - 2019/12/30 19:01:09.367271 [INFO] agent: Synced service "testing"
TestCatalogListServicesCommand - 2019/12/30 19:01:09.367348 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/12/30 19:01:09.367448 [DEBUG] http: Request PUT /v1/agent/service/register (365.796444ms) from=127.0.0.1:39626
TestCatalogListServicesCommand - 2019/12/30 19:01:09.367531 [DEBUG] agent: Service "testing" in sync
TestCatalogListServicesCommand - 2019/12/30 19:01:09.367577 [DEBUG] agent: Node info in sync
=== RUN   TestCatalogListServicesCommand/simple
TestCatalogListServicesCommand - 2019/12/30 19:01:09.373993 [DEBUG] http: Request GET /v1/catalog/services (1.839049ms) from=127.0.0.1:39628
=== RUN   TestCatalogListServicesCommand/tags
TestCatalogListServicesCommand - 2019/12/30 19:01:09.392476 [DEBUG] http: Request GET /v1/catalog/services (6.925185ms) from=127.0.0.1:39630
=== RUN   TestCatalogListServicesCommand/node_missing
TestCatalogListServicesCommand - 2019/12/30 19:01:09.407868 [DEBUG] http: Request GET /v1/catalog/node/not-a-real-node (1.780714ms) from=127.0.0.1:39632
=== RUN   TestCatalogListServicesCommand/node_present
TestCatalogListServicesCommand - 2019/12/30 19:01:09.414945 [DEBUG] http: Request GET /v1/catalog/node/Node%20e371371c-08b6-50c8-42be-1d60f9fb9f81 (1.954052ms) from=127.0.0.1:39634
=== RUN   TestCatalogListServicesCommand/node-meta
TestCatalogListServicesCommand - 2019/12/30 19:01:09.425335 [DEBUG] http: Request GET /v1/catalog/services?node-meta=foo%3Abar (1.890051ms) from=127.0.0.1:39636
TestCatalogListServicesCommand - 2019/12/30 19:01:09.427018 [INFO] agent: Requesting shutdown
TestCatalogListServicesCommand - 2019/12/30 19:01:09.427262 [INFO] consul: shutting down server
TestCatalogListServicesCommand - 2019/12/30 19:01:09.427481 [WARN] serf: Shutdown without a Leave
TestCatalogListServicesCommand - 2019/12/30 19:01:09.515862 [WARN] serf: Shutdown without a Leave
TestCatalogListServicesCommand - 2019/12/30 19:01:09.600081 [INFO] agent: consul server down
TestCatalogListServicesCommand - 2019/12/30 19:01:09.600422 [INFO] agent: shutdown complete
TestCatalogListServicesCommand - 2019/12/30 19:01:09.600640 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestCatalogListServicesCommand - 2019/12/30 19:01:09.600425 [INFO] manager: shutting down
TestCatalogListServicesCommand - 2019/12/30 19:01:09.600972 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestCatalogListServicesCommand - 2019/12/30 19:01:09.601232 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestCatalogListServicesCommand - 2019/12/30 19:01:09.602928 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListServicesCommand - 2019/12/30 19:01:09.603285 [INFO] agent: Endpoints down
--- PASS: TestCatalogListServicesCommand (4.86s)
    --- PASS: TestCatalogListServicesCommand/simple (0.01s)
    --- PASS: TestCatalogListServicesCommand/tags (0.01s)
    --- PASS: TestCatalogListServicesCommand/node_missing (0.01s)
    --- PASS: TestCatalogListServicesCommand/node_present (0.01s)
    --- PASS: TestCatalogListServicesCommand/node-meta (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/services	5.210s
?   	github.com/hashicorp/consul/command/config	[no test files]
=== RUN   TestConfigDelete_noTabs
=== PAUSE TestConfigDelete_noTabs
=== RUN   TestConfigDelete
=== PAUSE TestConfigDelete
=== RUN   TestConfigDelete_InvalidArgs
=== PAUSE TestConfigDelete_InvalidArgs
=== CONT  TestConfigDelete_noTabs
=== CONT  TestConfigDelete_InvalidArgs
=== RUN   TestConfigDelete_InvalidArgs/no_kind
=== CONT  TestConfigDelete
=== RUN   TestConfigDelete_InvalidArgs/no_name
--- PASS: TestConfigDelete_noTabs (0.01s)
--- PASS: TestConfigDelete_InvalidArgs (0.01s)
    --- PASS: TestConfigDelete_InvalidArgs/no_kind (0.01s)
    --- PASS: TestConfigDelete_InvalidArgs/no_name (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConfigDelete - 2019/12/30 19:01:04.946398 [WARN] agent: Node name "Node e4d67fc1-2b8b-088a-c8e2-e3f77bf44d44" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigDelete - 2019/12/30 19:01:04.947258 [DEBUG] tlsutil: Update with version 1
TestConfigDelete - 2019/12/30 19:01:04.953624 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:01:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e4d67fc1-2b8b-088a-c8e2-e3f77bf44d44 Address:127.0.0.1:52006}]
2019/12/30 19:01:06 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestConfigDelete - 2019/12/30 19:01:06.273997 [INFO] serf: EventMemberJoin: Node e4d67fc1-2b8b-088a-c8e2-e3f77bf44d44.dc1 127.0.0.1
TestConfigDelete - 2019/12/30 19:01:06.282633 [INFO] serf: EventMemberJoin: Node e4d67fc1-2b8b-088a-c8e2-e3f77bf44d44 127.0.0.1
TestConfigDelete - 2019/12/30 19:01:06.286730 [INFO] consul: Handled member-join event for server "Node e4d67fc1-2b8b-088a-c8e2-e3f77bf44d44.dc1" in area "wan"
TestConfigDelete - 2019/12/30 19:01:06.287351 [INFO] consul: Adding LAN server Node e4d67fc1-2b8b-088a-c8e2-e3f77bf44d44 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestConfigDelete - 2019/12/30 19:01:06.289594 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestConfigDelete - 2019/12/30 19:01:06.291367 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestConfigDelete - 2019/12/30 19:01:06.294155 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestConfigDelete - 2019/12/30 19:01:06.294487 [INFO] agent: started state syncer
2019/12/30 19:01:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:01:06 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/12/30 19:01:07 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:01:07 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestConfigDelete - 2019/12/30 19:01:07.209497 [INFO] consul: cluster leadership acquired
TestConfigDelete - 2019/12/30 19:01:07.210273 [INFO] consul: New leader elected: Node e4d67fc1-2b8b-088a-c8e2-e3f77bf44d44
TestConfigDelete - 2019/12/30 19:01:07.824846 [INFO] agent: Synced node info
TestConfigDelete - 2019/12/30 19:01:07.824982 [DEBUG] agent: Node info in sync
TestConfigDelete - 2019/12/30 19:01:07.829113 [DEBUG] http: Request PUT /v1/config (428.176778ms) from=127.0.0.1:35632
TestConfigDelete - 2019/12/30 19:01:08.534173 [DEBUG] http: Request DELETE /v1/config/service-defaults/web (700.950069ms) from=127.0.0.1:35634
TestConfigDelete - 2019/12/30 19:01:08.541976 [ERR] http: Request GET /v1/config/service-defaults/web, error: Config entry not found for "service-defaults" / "web" from=127.0.0.1:35632
TestConfigDelete - 2019/12/30 19:01:08.543538 [DEBUG] http: Request GET /v1/config/service-defaults/web (1.959052ms) from=127.0.0.1:35632
TestConfigDelete - 2019/12/30 19:01:08.545530 [INFO] agent: Requesting shutdown
TestConfigDelete - 2019/12/30 19:01:08.545782 [INFO] consul: shutting down server
TestConfigDelete - 2019/12/30 19:01:08.545922 [WARN] serf: Shutdown without a Leave
TestConfigDelete - 2019/12/30 19:01:08.716400 [WARN] serf: Shutdown without a Leave
TestConfigDelete - 2019/12/30 19:01:08.794632 [INFO] manager: shutting down
TestConfigDelete - 2019/12/30 19:01:08.982918 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestConfigDelete - 2019/12/30 19:01:08.983738 [INFO] agent: consul server down
TestConfigDelete - 2019/12/30 19:01:08.983830 [INFO] agent: shutdown complete
TestConfigDelete - 2019/12/30 19:01:08.984026 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestConfigDelete - 2019/12/30 19:01:08.984302 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestConfigDelete - 2019/12/30 19:01:08.984615 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestConfigDelete - 2019/12/30 19:01:08.986264 [INFO] agent: Waiting for endpoints to shut down
TestConfigDelete - 2019/12/30 19:01:08.986450 [INFO] agent: Endpoints down
--- PASS: TestConfigDelete (4.11s)
PASS
ok  	github.com/hashicorp/consul/command/config/delete	4.360s
=== RUN   TestConfigList_noTabs
=== PAUSE TestConfigList_noTabs
=== RUN   TestConfigList
WARNING: bootstrap = true: do not enable unless necessary
TestConfigList - 2019/12/30 19:01:26.706840 [WARN] agent: Node name "Node 3dded2c0-762b-d7f3-b2d1-bf4ec3be0677" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigList - 2019/12/30 19:01:26.707668 [DEBUG] tlsutil: Update with version 1
TestConfigList - 2019/12/30 19:01:26.714200 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:01:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3dded2c0-762b-d7f3-b2d1-bf4ec3be0677 Address:127.0.0.1:13006}]
2019/12/30 19:01:27 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
TestConfigList - 2019/12/30 19:01:27.763057 [INFO] serf: EventMemberJoin: Node 3dded2c0-762b-d7f3-b2d1-bf4ec3be0677.dc1 127.0.0.1
TestConfigList - 2019/12/30 19:01:27.766621 [INFO] serf: EventMemberJoin: Node 3dded2c0-762b-d7f3-b2d1-bf4ec3be0677 127.0.0.1
TestConfigList - 2019/12/30 19:01:27.768052 [INFO] consul: Adding LAN server Node 3dded2c0-762b-d7f3-b2d1-bf4ec3be0677 (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestConfigList - 2019/12/30 19:01:27.768179 [INFO] consul: Handled member-join event for server "Node 3dded2c0-762b-d7f3-b2d1-bf4ec3be0677.dc1" in area "wan"
TestConfigList - 2019/12/30 19:01:27.768737 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestConfigList - 2019/12/30 19:01:27.769080 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestConfigList - 2019/12/30 19:01:27.771994 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestConfigList - 2019/12/30 19:01:27.772149 [INFO] agent: started state syncer
2019/12/30 19:01:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:01:27 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/12/30 19:01:28 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:01:28 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
TestConfigList - 2019/12/30 19:01:28.317055 [INFO] consul: cluster leadership acquired
TestConfigList - 2019/12/30 19:01:28.317622 [INFO] consul: New leader elected: Node 3dded2c0-762b-d7f3-b2d1-bf4ec3be0677
TestConfigList - 2019/12/30 19:01:28.763067 [INFO] agent: Synced node info
TestConfigList - 2019/12/30 19:01:28.764183 [DEBUG] http: Request PUT /v1/config (385.764303ms) from=127.0.0.1:55732
TestConfigList - 2019/12/30 19:01:29.260700 [DEBUG] http: Request PUT /v1/config (489.169065ms) from=127.0.0.1:55732
TestConfigList - 2019/12/30 19:01:29.577382 [DEBUG] http: Request PUT /v1/config (308.139229ms) from=127.0.0.1:55732
TestConfigList - 2019/12/30 19:01:29.583296 [DEBUG] http: Request GET /v1/config/service-defaults (1.576709ms) from=127.0.0.1:55736
TestConfigList - 2019/12/30 19:01:29.586611 [INFO] agent: Requesting shutdown
TestConfigList - 2019/12/30 19:01:29.586722 [INFO] consul: shutting down server
TestConfigList - 2019/12/30 19:01:29.586775 [WARN] serf: Shutdown without a Leave
TestConfigList - 2019/12/30 19:01:29.758066 [WARN] serf: Shutdown without a Leave
TestConfigList - 2019/12/30 19:01:29.941440 [INFO] manager: shutting down
TestConfigList - 2019/12/30 19:01:30.074992 [INFO] agent: consul server down
TestConfigList - 2019/12/30 19:01:30.075070 [INFO] agent: shutdown complete
TestConfigList - 2019/12/30 19:01:30.075147 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestConfigList - 2019/12/30 19:01:30.075271 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestConfigList - 2019/12/30 19:01:30.075410 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestConfigList - 2019/12/30 19:01:30.076045 [INFO] agent: Waiting for endpoints to shut down
TestConfigList - 2019/12/30 19:01:30.076155 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestConfigList - 2019/12/30 19:01:30.076323 [INFO] agent: Endpoints down
TestConfigList - 2019/12/30 19:01:30.076395 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
--- PASS: TestConfigList (3.44s)
TestConfigList - 2019/12/30 19:01:30.076459 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
=== RUN   TestConfigList_InvalidArgs
=== PAUSE TestConfigList_InvalidArgs
=== CONT  TestConfigList_noTabs
--- PASS: TestConfigList_noTabs (0.00s)
=== CONT  TestConfigList_InvalidArgs
=== RUN   TestConfigList_InvalidArgs/no_kind
--- PASS: TestConfigList_InvalidArgs (0.00s)
    --- PASS: TestConfigList_InvalidArgs/no_kind (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/config/list	3.783s
=== RUN   TestConfigRead_noTabs
=== PAUSE TestConfigRead_noTabs
=== RUN   TestConfigRead
=== PAUSE TestConfigRead
=== RUN   TestConfigRead_InvalidArgs
=== PAUSE TestConfigRead_InvalidArgs
=== CONT  TestConfigRead_noTabs
--- PASS: TestConfigRead_noTabs (0.00s)
=== CONT  TestConfigRead_InvalidArgs
=== RUN   TestConfigRead_InvalidArgs/no_kind
=== RUN   TestConfigRead_InvalidArgs/no_name
=== CONT  TestConfigRead
--- PASS: TestConfigRead_InvalidArgs (0.01s)
    --- PASS: TestConfigRead_InvalidArgs/no_kind (0.00s)
    --- PASS: TestConfigRead_InvalidArgs/no_name (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConfigRead - 2019/12/30 19:01:26.267852 [WARN] agent: Node name "Node a27270ca-6217-54f4-1389-33cd9a9b6323" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigRead - 2019/12/30 19:01:26.268803 [DEBUG] tlsutil: Update with version 1
TestConfigRead - 2019/12/30 19:01:26.277179 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:01:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a27270ca-6217-54f4-1389-33cd9a9b6323 Address:127.0.0.1:17506}]
2019/12/30 19:01:27 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestConfigRead - 2019/12/30 19:01:27.197146 [INFO] serf: EventMemberJoin: Node a27270ca-6217-54f4-1389-33cd9a9b6323.dc1 127.0.0.1
TestConfigRead - 2019/12/30 19:01:27.200875 [INFO] serf: EventMemberJoin: Node a27270ca-6217-54f4-1389-33cd9a9b6323 127.0.0.1
TestConfigRead - 2019/12/30 19:01:27.202125 [INFO] consul: Handled member-join event for server "Node a27270ca-6217-54f4-1389-33cd9a9b6323.dc1" in area "wan"
TestConfigRead - 2019/12/30 19:01:27.202223 [INFO] consul: Adding LAN server Node a27270ca-6217-54f4-1389-33cd9a9b6323 (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestConfigRead - 2019/12/30 19:01:27.202788 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestConfigRead - 2019/12/30 19:01:27.203187 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestConfigRead - 2019/12/30 19:01:27.206348 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestConfigRead - 2019/12/30 19:01:27.206666 [INFO] agent: started state syncer
2019/12/30 19:01:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:01:27 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/12/30 19:01:27 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:01:27 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestConfigRead - 2019/12/30 19:01:27.833741 [INFO] consul: cluster leadership acquired
TestConfigRead - 2019/12/30 19:01:27.834262 [INFO] consul: New leader elected: Node a27270ca-6217-54f4-1389-33cd9a9b6323
TestConfigRead - 2019/12/30 19:01:28.159606 [INFO] agent: Synced node info
TestConfigRead - 2019/12/30 19:01:28.159743 [DEBUG] agent: Node info in sync
TestConfigRead - 2019/12/30 19:01:28.473033 [DEBUG] agent: Node info in sync
TestConfigRead - 2019/12/30 19:01:28.486667 [DEBUG] http: Request PUT /v1/config (454.936818ms) from=127.0.0.1:47116
TestConfigRead - 2019/12/30 19:01:28.497233 [DEBUG] http: Request GET /v1/config/service-defaults/web (1.720379ms) from=127.0.0.1:47120
TestConfigRead - 2019/12/30 19:01:28.500881 [INFO] agent: Requesting shutdown
TestConfigRead - 2019/12/30 19:01:28.500992 [INFO] consul: shutting down server
TestConfigRead - 2019/12/30 19:01:28.501053 [WARN] serf: Shutdown without a Leave
TestConfigRead - 2019/12/30 19:01:28.658068 [WARN] serf: Shutdown without a Leave
TestConfigRead - 2019/12/30 19:01:28.749820 [INFO] manager: shutting down
TestConfigRead - 2019/12/30 19:01:28.749851 [ERR] consul: failed to establish leadership: raft is already shutdown
TestConfigRead - 2019/12/30 19:01:28.750248 [INFO] agent: consul server down
TestConfigRead - 2019/12/30 19:01:28.750320 [INFO] agent: shutdown complete
TestConfigRead - 2019/12/30 19:01:28.750376 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestConfigRead - 2019/12/30 19:01:28.750522 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestConfigRead - 2019/12/30 19:01:28.750674 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestConfigRead - 2019/12/30 19:01:28.751199 [INFO] agent: Waiting for endpoints to shut down
TestConfigRead - 2019/12/30 19:01:28.751378 [INFO] agent: Endpoints down
--- PASS: TestConfigRead (2.56s)
PASS
ok  	github.com/hashicorp/consul/command/config/read	2.826s
=== RUN   TestConfigWrite_noTabs
=== PAUSE TestConfigWrite_noTabs
=== RUN   TestConfigWrite
=== PAUSE TestConfigWrite
=== CONT  TestConfigWrite_noTabs
=== CONT  TestConfigWrite
--- PASS: TestConfigWrite_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConfigWrite - 2019/12/30 19:01:58.529169 [WARN] agent: Node name "Node c93efd31-17ec-7aa4-863f-358eee13cd76" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigWrite - 2019/12/30 19:01:58.530681 [DEBUG] tlsutil: Update with version 1
TestConfigWrite - 2019/12/30 19:01:58.539012 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:01:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c93efd31-17ec-7aa4-863f-358eee13cd76 Address:127.0.0.1:20506}]
2019/12/30 19:01:59 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestConfigWrite - 2019/12/30 19:01:59.705919 [INFO] serf: EventMemberJoin: Node c93efd31-17ec-7aa4-863f-358eee13cd76.dc1 127.0.0.1
TestConfigWrite - 2019/12/30 19:01:59.709911 [INFO] serf: EventMemberJoin: Node c93efd31-17ec-7aa4-863f-358eee13cd76 127.0.0.1
TestConfigWrite - 2019/12/30 19:01:59.711310 [INFO] consul: Adding LAN server Node c93efd31-17ec-7aa4-863f-358eee13cd76 (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestConfigWrite - 2019/12/30 19:01:59.711702 [INFO] consul: Handled member-join event for server "Node c93efd31-17ec-7aa4-863f-358eee13cd76.dc1" in area "wan"
TestConfigWrite - 2019/12/30 19:01:59.712278 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestConfigWrite - 2019/12/30 19:01:59.712351 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestConfigWrite - 2019/12/30 19:01:59.715731 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestConfigWrite - 2019/12/30 19:01:59.715885 [INFO] agent: started state syncer
2019/12/30 19:01:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:01:59 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/30 19:02:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:02:00 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestConfigWrite - 2019/12/30 19:02:00.200863 [INFO] consul: cluster leadership acquired
TestConfigWrite - 2019/12/30 19:02:00.201403 [INFO] consul: New leader elected: Node c93efd31-17ec-7aa4-863f-358eee13cd76
=== RUN   TestConfigWrite/File
TestConfigWrite - 2019/12/30 19:02:00.840336 [INFO] agent: Synced node info
TestConfigWrite - 2019/12/30 19:02:00.841995 [DEBUG] http: Request PUT /v1/config (561.023967ms) from=127.0.0.1:33224
TestConfigWrite - 2019/12/30 19:02:00.856614 [DEBUG] http: Request GET /v1/config/service-defaults/web (2.340729ms) from=127.0.0.1:33226
=== RUN   TestConfigWrite/Stdin
TestConfigWrite - 2019/12/30 19:02:01.261924 [DEBUG] http: Request PUT /v1/config (396.053899ms) from=127.0.0.1:33228
TestConfigWrite - 2019/12/30 19:02:01.266953 [DEBUG] http: Request GET /v1/config/proxy-defaults/global (2.324729ms) from=127.0.0.1:33226
=== RUN   TestConfigWrite/No_config
TestConfigWrite - 2019/12/30 19:02:01.271672 [INFO] agent: Requesting shutdown
TestConfigWrite - 2019/12/30 19:02:01.271799 [INFO] consul: shutting down server
TestConfigWrite - 2019/12/30 19:02:01.271853 [WARN] serf: Shutdown without a Leave
TestConfigWrite - 2019/12/30 19:02:01.384591 [WARN] serf: Shutdown without a Leave
TestConfigWrite - 2019/12/30 19:02:01.884504 [INFO] manager: shutting down
TestConfigWrite - 2019/12/30 19:02:02.209110 [INFO] agent: consul server down
TestConfigWrite - 2019/12/30 19:02:02.209230 [INFO] agent: shutdown complete
TestConfigWrite - 2019/12/30 19:02:02.209327 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestConfigWrite - 2019/12/30 19:02:02.209682 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestConfigWrite - 2019/12/30 19:02:02.209916 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestConfigWrite - 2019/12/30 19:02:02.211102 [INFO] agent: Waiting for endpoints to shut down
TestConfigWrite - 2019/12/30 19:02:02.211301 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestConfigWrite - 2019/12/30 19:02:02.211582 [INFO] agent: Endpoints down
--- PASS: TestConfigWrite (3.76s)
    --- PASS: TestConfigWrite/File (0.59s)
    --- PASS: TestConfigWrite/Stdin (0.41s)
    --- PASS: TestConfigWrite/No_config (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/config/write	4.152s
=== RUN   TestConnectCommand_noTabs
=== PAUSE TestConnectCommand_noTabs
=== CONT  TestConnectCommand_noTabs
--- PASS: TestConnectCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect	0.042s
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca	0.048s
=== RUN   TestConnectCAGetConfigCommand_noTabs
=== PAUSE TestConnectCAGetConfigCommand_noTabs
=== RUN   TestConnectCAGetConfigCommand
=== PAUSE TestConnectCAGetConfigCommand
=== CONT  TestConnectCAGetConfigCommand_noTabs
=== CONT  TestConnectCAGetConfigCommand
--- PASS: TestConnectCAGetConfigCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.223798 [WARN] agent: Node name "Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.224909 [DEBUG] tlsutil: Update with version 1
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.231379 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:02:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ecbd28b8-074a-7565-ea6a-a69a7d83ce68 Address:127.0.0.1:23506}]
2019/12/30 19:02:15 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.946975 [INFO] serf: EventMemberJoin: Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68.dc1 127.0.0.1
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.951992 [INFO] serf: EventMemberJoin: Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68 127.0.0.1
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.953865 [INFO] consul: Adding LAN server Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68 (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.953958 [INFO] consul: Handled member-join event for server "Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68.dc1" in area "wan"
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.954614 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.955201 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.958011 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestConnectCAGetConfigCommand - 2019/12/30 19:02:15.958158 [INFO] agent: started state syncer
2019/12/30 19:02:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:02:15 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/30 19:02:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:02:17 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestConnectCAGetConfigCommand - 2019/12/30 19:02:17.001155 [INFO] consul: cluster leadership acquired
TestConnectCAGetConfigCommand - 2019/12/30 19:02:17.001721 [INFO] consul: New leader elected: Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68
TestConnectCAGetConfigCommand - 2019/12/30 19:02:17.276695 [INFO] agent: Synced node info
TestConnectCAGetConfigCommand - 2019/12/30 19:02:17.830388 [DEBUG] agent: Node info in sync
TestConnectCAGetConfigCommand - 2019/12/30 19:02:17.830525 [DEBUG] agent: Node info in sync
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.101590 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.102075 [DEBUG] consul: Skipping self join check for "Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68" since the cluster is too small
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.102259 [INFO] consul: member 'Node ecbd28b8-074a-7565-ea6a-a69a7d83ce68' joined, marking health alive
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.279882 [DEBUG] http: Request GET /v1/connect/ca/configuration (1.770714ms) from=127.0.0.1:37880
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.283561 [INFO] agent: Requesting shutdown
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.283700 [INFO] consul: shutting down server
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.283799 [WARN] serf: Shutdown without a Leave
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.342405 [WARN] serf: Shutdown without a Leave
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.400907 [INFO] manager: shutting down
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.401458 [INFO] agent: consul server down
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.401525 [INFO] agent: shutdown complete
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.401614 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.401780 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.401955 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.402487 [INFO] agent: Waiting for endpoints to shut down
TestConnectCAGetConfigCommand - 2019/12/30 19:02:18.402710 [INFO] agent: Endpoints down
--- PASS: TestConnectCAGetConfigCommand (3.25s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca/get	3.574s
=== RUN   TestConnectCASetConfigCommand_noTabs
=== PAUSE TestConnectCASetConfigCommand_noTabs
=== RUN   TestConnectCASetConfigCommand
=== PAUSE TestConnectCASetConfigCommand
=== CONT  TestConnectCASetConfigCommand_noTabs
=== CONT  TestConnectCASetConfigCommand
--- PASS: TestConnectCASetConfigCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCASetConfigCommand - 2019/12/30 19:02:18.666511 [WARN] agent: Node name "Node da00e0ed-1761-b1ed-a604-0f17d1f49316" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCASetConfigCommand - 2019/12/30 19:02:18.667530 [DEBUG] tlsutil: Update with version 1
TestConnectCASetConfigCommand - 2019/12/30 19:02:18.674142 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:02:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:da00e0ed-1761-b1ed-a604-0f17d1f49316 Address:127.0.0.1:38506}]
2019/12/30 19:02:19 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.430208 [INFO] serf: EventMemberJoin: Node da00e0ed-1761-b1ed-a604-0f17d1f49316.dc1 127.0.0.1
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.433744 [INFO] serf: EventMemberJoin: Node da00e0ed-1761-b1ed-a604-0f17d1f49316 127.0.0.1
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.435327 [INFO] consul: Adding LAN server Node da00e0ed-1761-b1ed-a604-0f17d1f49316 (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.435416 [INFO] consul: Handled member-join event for server "Node da00e0ed-1761-b1ed-a604-0f17d1f49316.dc1" in area "wan"
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.435837 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.441527 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.444363 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.444610 [INFO] agent: started state syncer
2019/12/30 19:02:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:02:19 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/12/30 19:02:19 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:02:19 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.893337 [INFO] consul: cluster leadership acquired
TestConnectCASetConfigCommand - 2019/12/30 19:02:19.893956 [INFO] consul: New leader elected: Node da00e0ed-1761-b1ed-a604-0f17d1f49316
TestConnectCASetConfigCommand - 2019/12/30 19:02:20.385521 [INFO] agent: Synced node info
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.293798 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.295029 [DEBUG] consul: Skipping self join check for "Node da00e0ed-1761-b1ed-a604-0f17d1f49316" since the cluster is too small
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.295354 [INFO] consul: member 'Node da00e0ed-1761-b1ed-a604-0f17d1f49316' joined, marking health alive
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.668201 [INFO] connect: CA provider config updated
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.668433 [DEBUG] http: Request PUT /v1/connect/ca/configuration (182.30686ms) from=127.0.0.1:47498
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.670037 [INFO] agent: Requesting shutdown
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.670133 [INFO] consul: shutting down server
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.670191 [WARN] serf: Shutdown without a Leave
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.717539 [WARN] serf: Shutdown without a Leave
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.767757 [INFO] manager: shutting down
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.768669 [INFO] agent: consul server down
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.768843 [INFO] agent: shutdown complete
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.768947 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.769623 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.769956 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.771377 [INFO] agent: Waiting for endpoints to shut down
TestConnectCASetConfigCommand - 2019/12/30 19:02:21.771492 [INFO] agent: Endpoints down
--- PASS: TestConnectCASetConfigCommand (3.25s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca/set	3.749s
=== RUN   TestBootstrapConfig_ConfigureArgs
=== RUN   TestBootstrapConfig_ConfigureArgs/defaults
=== RUN   TestBootstrapConfig_ConfigureArgs/extra-stats-sinks
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-statsd-sink
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-plus-extra
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-env
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-unix-sink
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink-env
=== RUN   TestBootstrapConfig_ConfigureArgs/stats-config-override
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-tags
=== RUN   TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr
=== RUN   TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr-with-overrides
=== RUN   TestBootstrapConfig_ConfigureArgs/stats-flush-interval
=== RUN   TestBootstrapConfig_ConfigureArgs/override-tracing
=== RUN   TestBootstrapConfig_ConfigureArgs/err-bad-prometheus-addr
=== RUN   TestBootstrapConfig_ConfigureArgs/err-bad-statsd-addr
=== RUN   TestBootstrapConfig_ConfigureArgs/err-bad-dogstatsd-addr
--- PASS: TestBootstrapConfig_ConfigureArgs (0.02s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/defaults (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/extra-stats-sinks (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-statsd-sink (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-plus-extra (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-env (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-unix-sink (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink-env (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/stats-config-override (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-tags (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr-with-overrides (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/stats-flush-interval (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/override-tracing (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/err-bad-prometheus-addr (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/err-bad-statsd-addr (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/err-bad-dogstatsd-addr (0.00s)
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== RUN   TestGenerateConfig
=== RUN   TestGenerateConfig/no-args
=== RUN   TestGenerateConfig/defaults
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
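The defaults output above is plain Envoy bootstrap JSON. As a sketch (not part of the test suite), the invariants the test asserts — admin API on 19000, the `local_agent` cluster pointing at the agent's default gRPC/xDS port 8502 — can be checked by parsing a minimal subset of that output:

```python
import json

# Minimal subset of the "defaults" bootstrap printed above.
bootstrap = json.loads("""
{
  "admin": {
    "address": {"socket_address": {"address": "127.0.0.1", "port_value": 19000}}
  },
  "node": {"cluster": "test-proxy", "id": "test-proxy"},
  "static_resources": {
    "clusters": [
      {"name": "local_agent",
       "hosts": [{"socket_address": {"address": "127.0.0.1", "port_value": 8502}}]}
    ]
  }
}
""")

# Envoy's xDS traffic goes to the agent cluster; the admin API defaults to 19000.
agent = bootstrap["static_resources"]["clusters"][0]
assert agent["name"] == "local_agent"
assert agent["hosts"][0]["socket_address"]["port_value"] == 8502
assert bootstrap["admin"]["address"]["socket_address"]["port_value"] == 19000
```

The later subtests (`grpc-addr-flag`, `grpc-addr-env`) vary exactly the agent port field checked here, replacing 8502 with 9999.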
=== RUN   TestGenerateConfig/token-arg
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/token-env
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/token-file-arg
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/token-file-env
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/grpc-addr-flag
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 9999
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/grpc-addr-env
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 9999
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/access-log-path
{
  "admin": {
    "access_log_path": "/some/path/access.log",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/custom-bootstrap

{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  custom_field = "foo"
}
=== RUN   TestGenerateConfig/extra_-single
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      },
      {
        "name": "fake_cluster_1"
      }
    ],
    "listeners": [
      {
        "name": "fake_listener_1"
      }
    ]
  },
  "stats_sinks": [
    {
      "name": "fake_sink_1"
    }
  ],
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/extra_-multiple
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      },
      {
        "name": "fake_cluster_1"
      },
      {
        "name": "fake_cluster_2"
      }
    ],
    "listeners": [
      {
        "name": "fake_listener_1"
      },
      {
        "name": "fake_listener_2"
      }
    ]
  },
  "stats_sinks": [
    {
      "name": "fake_sink_1"
    },
    {
      "name": "fake_sink_2"
    }
  ],
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/stats-config-override
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
    "name": "fake_config"
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/zipkin-tracing-config
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      },
      {
        "name": "zipkin",
        "type": "STRICT_DNS",
        "connect_timeout": "5s",
        "load_assignment": {
          "cluster_name": "zipkin",
          "endpoints": [
            {
              "lb_endpoints": [
                {
                  "endpoint": {
                    "address": {
                      "socket_address": {
                        "address": "zipkin.service.consul",
                        "port_value": 9411
                      }
                    }
                  }
                }
              ]
            }
          ]
        }
      }
    ]
  },
  "stats_config": {
    "stats_tags": [
      {
        "tag_name": "local_cluster",
        "fixed_value": "test-proxy"
      }
    ],
    "use_all_default_tags": true
  },
  "tracing": {
    "http": {
      "name": "envoy.zipkin",
      "config": {
        "collector_cluster": "zipkin",
        "collector_endpoint": "/api/v1/spans"
      }
    }
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
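One constraint visible in the zipkin-tracing-config output above: the `collector_cluster` named in the tracing block must match a cluster defined in `static_resources`, since Envoy rejects a dangling cluster reference. A sketch of that cross-check over a minimal subset of the output (not code from the test suite):

```python
# Minimal subset of the zipkin-tracing-config bootstrap printed above.
bootstrap = {
    "static_resources": {
        "clusters": [{"name": "local_agent"}, {"name": "zipkin"}],
    },
    "tracing": {
        "http": {
            "name": "envoy.zipkin",
            "config": {"collector_cluster": "zipkin",
                       "collector_endpoint": "/api/v1/spans"},
        }
    },
}

# The tracing config must reference a cluster that actually exists.
cluster_names = {c["name"] for c in bootstrap["static_resources"]["clusters"]}
collector = bootstrap["tracing"]["http"]["config"]["collector_cluster"]
assert collector in cluster_names
```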
--- PASS: TestGenerateConfig (0.48s)
    --- PASS: TestGenerateConfig/no-args (0.01s)
    --- PASS: TestGenerateConfig/defaults (0.08s)
    --- PASS: TestGenerateConfig/token-arg (0.04s)
    --- PASS: TestGenerateConfig/token-env (0.03s)
    --- PASS: TestGenerateConfig/token-file-arg (0.04s)
    --- PASS: TestGenerateConfig/token-file-env (0.03s)
    --- PASS: TestGenerateConfig/grpc-addr-flag (0.02s)
    --- PASS: TestGenerateConfig/grpc-addr-env (0.03s)
    --- PASS: TestGenerateConfig/access-log-path (0.03s)
    --- PASS: TestGenerateConfig/custom-bootstrap (0.02s)
    --- PASS: TestGenerateConfig/extra_-single (0.05s)
    --- PASS: TestGenerateConfig/extra_-multiple (0.03s)
    --- PASS: TestGenerateConfig/stats-config-override (0.03s)
    --- PASS: TestGenerateConfig/zipkin-tracing-config (0.03s)
=== RUN   TestExecEnvoy
=== RUN   TestExecEnvoy/default
=== RUN   TestExecEnvoy/hot-restart-epoch
=== RUN   TestExecEnvoy/hot-restart-version
=== RUN   TestExecEnvoy/hot-restart-version#01
=== RUN   TestExecEnvoy/hot-restart-version#02
--- PASS: TestExecEnvoy (3.28s)
    --- PASS: TestExecEnvoy/default (0.68s)
    --- PASS: TestExecEnvoy/hot-restart-epoch (0.59s)
    --- PASS: TestExecEnvoy/hot-restart-version (0.65s)
    --- PASS: TestExecEnvoy/hot-restart-version#01 (0.66s)
    --- PASS: TestExecEnvoy/hot-restart-version#02 (0.70s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect/envoy	3.949s
=== RUN   TestConnectEnvoyPipeBootstrapCommand_noTabs
=== PAUSE TestConnectEnvoyPipeBootstrapCommand_noTabs
=== CONT  TestConnectEnvoyPipeBootstrapCommand_noTabs
--- PASS: TestConnectEnvoyPipeBootstrapCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap	0.075s
=== RUN   TestFlagUpstreams_impl
--- PASS: TestFlagUpstreams_impl (0.00s)
=== RUN   TestFlagUpstreams
=== RUN   TestFlagUpstreams/bad_format
=== RUN   TestFlagUpstreams/port_not_int
=== RUN   TestFlagUpstreams/4_parts
=== RUN   TestFlagUpstreams/single_value
=== RUN   TestFlagUpstreams/single_value_prepared_query
=== RUN   TestFlagUpstreams/invalid_type
=== RUN   TestFlagUpstreams/address_specified
=== RUN   TestFlagUpstreams/repeat_value,_overwrite
--- PASS: TestFlagUpstreams (0.00s)
    --- PASS: TestFlagUpstreams/bad_format (0.00s)
    --- PASS: TestFlagUpstreams/port_not_int (0.00s)
    --- PASS: TestFlagUpstreams/4_parts (0.00s)
    --- PASS: TestFlagUpstreams/single_value (0.00s)
    --- PASS: TestFlagUpstreams/single_value_prepared_query (0.00s)
    --- PASS: TestFlagUpstreams/invalid_type (0.00s)
    --- PASS: TestFlagUpstreams/address_specified (0.00s)
    --- PASS: TestFlagUpstreams/repeat_value,_overwrite (0.00s)
=== RUN   TestCommandConfigWatcher
=== PAUSE TestCommandConfigWatcher
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== RUN   TestRegisterMonitor_good
=== PAUSE TestRegisterMonitor_good
=== RUN   TestRegisterMonitor_heartbeat
=== PAUSE TestRegisterMonitor_heartbeat
=== CONT  TestCommandConfigWatcher
=== CONT  TestRegisterMonitor_heartbeat
=== RUN   TestCommandConfigWatcher/-service_flag_only
=== CONT  TestRegisterMonitor_good
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:52.941110 [WARN] agent: Node name "Node 8a9a0442-5140-a4a0-f801-192e4240d07d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:52.954676 [WARN] agent: Node name "Node ea5fcf75-906b-72aa-7ed8-ef6b557c48ef" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRegisterMonitor_good - 2019/12/30 19:03:52.967152 [WARN] agent: Node name "Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:53.116129 [DEBUG] tlsutil: Update with version 1
TestRegisterMonitor_good - 2019/12/30 19:03:53.116296 [DEBUG] tlsutil: Update with version 1
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:53.117790 [DEBUG] tlsutil: Update with version 1
TestRegisterMonitor_good - 2019/12/30 19:03:53.122849 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:53.123039 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:53.129511 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:03:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ca07c82a-0c86-4c10-81fa-7c1cb7b896f6 Address:127.0.0.1:46012}]
2019/12/30 19:03:57 [INFO]  raft: Node at 127.0.0.1:46012 [Follower] entering Follower state (Leader: "")
TestRegisterMonitor_good - 2019/12/30 19:03:57.351455 [INFO] serf: EventMemberJoin: Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6.dc1 127.0.0.1
TestRegisterMonitor_good - 2019/12/30 19:03:57.354719 [INFO] serf: EventMemberJoin: Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6 127.0.0.1
TestRegisterMonitor_good - 2019/12/30 19:03:57.355461 [INFO] consul: Adding LAN server Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6 (Addr: tcp/127.0.0.1:46012) (DC: dc1)
TestRegisterMonitor_good - 2019/12/30 19:03:57.355754 [INFO] consul: Handled member-join event for server "Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6.dc1" in area "wan"
TestRegisterMonitor_good - 2019/12/30 19:03:57.356067 [INFO] agent: Started DNS server 127.0.0.1:46007 (udp)
TestRegisterMonitor_good - 2019/12/30 19:03:57.356145 [INFO] agent: Started DNS server 127.0.0.1:46007 (tcp)
TestRegisterMonitor_good - 2019/12/30 19:03:57.358498 [INFO] agent: Started HTTP server on 127.0.0.1:46008 (tcp)
TestRegisterMonitor_good - 2019/12/30 19:03:57.358656 [INFO] agent: started state syncer
2019/12/30 19:03:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:03:57 [INFO]  raft: Node at 127.0.0.1:46012 [Candidate] entering Candidate state in term 2
2019/12/30 19:03:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8a9a0442-5140-a4a0-f801-192e4240d07d Address:127.0.0.1:46006}]
2019/12/30 19:03:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ea5fcf75-906b-72aa-7ed8-ef6b557c48ef Address:127.0.0.1:46018}]
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.733049 [INFO] serf: EventMemberJoin: Node ea5fcf75-906b-72aa-7ed8-ef6b557c48ef.dc1 127.0.0.1
2019/12/30 19:03:57 [INFO]  raft: Node at 127.0.0.1:46018 [Follower] entering Follower state (Leader: "")
2019/12/30 19:03:57 [INFO]  raft: Node at 127.0.0.1:46006 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.739839 [INFO] serf: EventMemberJoin: Node ea5fcf75-906b-72aa-7ed8-ef6b557c48ef 127.0.0.1
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.741577 [INFO] consul: Adding LAN server Node ea5fcf75-906b-72aa-7ed8-ef6b557c48ef (Addr: tcp/127.0.0.1:46018) (DC: dc1)
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.741878 [INFO] consul: Handled member-join event for server "Node ea5fcf75-906b-72aa-7ed8-ef6b557c48ef.dc1" in area "wan"
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.744966 [INFO] serf: EventMemberJoin: Node 8a9a0442-5140-a4a0-f801-192e4240d07d.dc1 127.0.0.1
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.755550 [INFO] serf: EventMemberJoin: Node 8a9a0442-5140-a4a0-f801-192e4240d07d 127.0.0.1
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.762359 [INFO] consul: Handled member-join event for server "Node 8a9a0442-5140-a4a0-f801-192e4240d07d.dc1" in area "wan"
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.762889 [INFO] consul: Adding LAN server Node 8a9a0442-5140-a4a0-f801-192e4240d07d (Addr: tcp/127.0.0.1:46006) (DC: dc1)
2019/12/30 19:03:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:03:57 [INFO]  raft: Node at 127.0.0.1:46018 [Candidate] entering Candidate state in term 2
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.779971 [INFO] agent: Started DNS server 127.0.0.1:46013 (udp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.780395 [INFO] agent: Started DNS server 127.0.0.1:46013 (tcp)
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.781284 [INFO] agent: Started DNS server 127.0.0.1:46001 (udp)
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.781636 [INFO] agent: Started DNS server 127.0.0.1:46001 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.782778 [INFO] agent: Started HTTP server on 127.0.0.1:46014 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:03:57.782898 [INFO] agent: started state syncer
2019/12/30 19:03:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.783895 [INFO] agent: Started HTTP server on 127.0.0.1:46002 (tcp)
2019/12/30 19:03:57 [INFO]  raft: Node at 127.0.0.1:46006 [Candidate] entering Candidate state in term 2
TestRegisterMonitor_heartbeat - 2019/12/30 19:03:57.783975 [INFO] agent: started state syncer
2019/12/30 19:03:59 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:03:59 [INFO]  raft: Node at 127.0.0.1:46012 [Leader] entering Leader state
TestRegisterMonitor_good - 2019/12/30 19:03:59.786905 [INFO] consul: cluster leadership acquired
TestRegisterMonitor_good - 2019/12/30 19:03:59.787726 [INFO] consul: New leader elected: Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6
TestRegisterMonitor_good - 2019/12/30 19:03:59.860349 [DEBUG] http: Request GET /v1/agent/services (3.233419ms) from=127.0.0.1:60006
TestRegisterMonitor_good - 2019/12/30 19:03:59.860748 [DEBUG] http: Request GET /v1/catalog/service/foo-proxy?stale= (2.946412ms) from=127.0.0.1:60008
TestRegisterMonitor_good - 2019/12/30 19:03:59.869565 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestRegisterMonitor_good - 2019/12/30 19:03:59.889156 [DEBUG] http: Request GET /v1/agent/services (1.53104ms) from=127.0.0.1:60006
2019/12/30 19:04:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:00 [INFO]  raft: Node at 127.0.0.1:46018 [Leader] entering Leader state
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:00.420708 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:00.421288 [INFO] consul: New leader elected: Node ea5fcf75-906b-72aa-7ed8-ef6b557c48ef
2019/12/30 19:04:00 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:00 [INFO]  raft: Node at 127.0.0.1:46006 [Leader] entering Leader state
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:00.423847 [INFO] consul: cluster leadership acquired
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:00.424598 [INFO] consul: New leader elected: Node 8a9a0442-5140-a4a0-f801-192e4240d07d
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:00.495710 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:00.495880 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:00.495932 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:00.496387 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:00.795239 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_good - 2019/12/30 19:04:00.795587 [INFO] agent: Synced service "foo-proxy"
TestRegisterMonitor_good - 2019/12/30 19:04:00.795680 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_good - 2019/12/30 19:04:00.795719 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:01.161500 [INFO] manager: shutting down
TestRegisterMonitor_good - 2019/12/30 19:04:01.176352 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_good - 2019/12/30 19:04:01.176477 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_good - 2019/12/30 19:04:01.176554 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/30 19:04:01.176660 [DEBUG] http: Request PUT /v1/agent/service/register (1.313319574s) from=127.0.0.1:60008
2019/12/30 19:04:01 [INFO] proxy: registered Consul service: foo-proxy
2019/12/30 19:04:01 [INFO] proxy: stop request received, deregistering
TestRegisterMonitor_good - 2019/12/30 19:04:01.186096 [DEBUG] agent: removed check "foo-proxy-ttl"
TestRegisterMonitor_good - 2019/12/30 19:04:01.186258 [DEBUG] agent: removed service "foo-proxy"
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:01.809232 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.445509 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.445619 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.445712 [INFO] agent: Stopping DNS server 127.0.0.1:46013 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.445971 [INFO] agent: Stopping DNS server 127.0.0.1:46013 (udp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.446216 [INFO] agent: Stopping HTTP server 127.0.0.1:46014 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.446551 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.446683 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service_flag_with_upstreams
TestCommandConfigWatcher/-service_flag_only - 2019/12/30 19:04:02.447601 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:02.459466 [INFO] agent: Synced node info
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:02.459575 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:02.516579 [WARN] agent: Node name "Node b94fcd30-cad0-846f-6b83-76106d53b9b9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:02.516955 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:02.519556 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/12/30 19:04:03.386707 [INFO] agent: Deregistered service "foo-proxy"
TestRegisterMonitor_good - 2019/12/30 19:04:04.036748 [INFO] agent: Deregistered check "foo-proxy-ttl"
TestRegisterMonitor_good - 2019/12/30 19:04:04.036833 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/30 19:04:04.036926 [DEBUG] http: Request PUT /v1/agent/service/deregister/foo-proxy (2.857429612s) from=127.0.0.1:60008
2019/12/30 19:04:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b94fcd30-cad0-846f-6b83-76106d53b9b9 Address:127.0.0.1:46024}]
TestRegisterMonitor_good - 2019/12/30 19:04:04.745801 [INFO] agent: Deregistered service "foo-proxy"
2019/12/30 19:04:04 [INFO]  raft: Node at 127.0.0.1:46024 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.749887 [INFO] serf: EventMemberJoin: Node b94fcd30-cad0-846f-6b83-76106d53b9b9.dc1 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.753687 [INFO] serf: EventMemberJoin: Node b94fcd30-cad0-846f-6b83-76106d53b9b9 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.754588 [INFO] consul: Handled member-join event for server "Node b94fcd30-cad0-846f-6b83-76106d53b9b9.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.754719 [INFO] consul: Adding LAN server Node b94fcd30-cad0-846f-6b83-76106d53b9b9 (Addr: tcp/127.0.0.1:46024) (DC: dc1)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.756492 [INFO] agent: Started DNS server 127.0.0.1:46019 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.756695 [INFO] agent: Started DNS server 127.0.0.1:46019 (udp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.759178 [INFO] agent: Started HTTP server on 127.0.0.1:46020 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:04.759271 [INFO] agent: started state syncer
2019/12/30 19:04:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:04 [INFO]  raft: Node at 127.0.0.1:46024 [Candidate] entering Candidate state in term 2
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.153896 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.155452 [DEBUG] consul: Skipping self join check for "Node 8a9a0442-5140-a4a0-f801-192e4240d07d" since the cluster is too small
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.155649 [INFO] consul: member 'Node 8a9a0442-5140-a4a0-f801-192e4240d07d' joined, marking health alive
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.429103 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/30 19:04:05.496456 [INFO] agent: Deregistered check "foo-proxy-ttl"
TestRegisterMonitor_good - 2019/12/30 19:04:05.496671 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/30 19:04:05.496842 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/30 19:04:05.498459 [DEBUG] http: Request GET /v1/agent/services (1.459610792s) from=127.0.0.1:60008
TestRegisterMonitor_good - 2019/12/30 19:04:05.496600 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/12/30 19:04:05.501392 [INFO] agent: Requesting shutdown
TestRegisterMonitor_good - 2019/12/30 19:04:05.501578 [INFO] consul: shutting down server
TestRegisterMonitor_good - 2019/12/30 19:04:05.501695 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_good - 2019/12/30 19:04:05.503489 [WARN] consul: error getting server health from "Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6": rpc error making call: EOF
TestRegisterMonitor_good - 2019/12/30 19:04:05.711433 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.714282 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.737003 [DEBUG] http: Request GET /v1/agent/services (483.679µs) from=127.0.0.1:51018
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.746272 [DEBUG] http: Request GET /v1/catalog/service/foo-proxy?stale= (8.366889ms) from=127.0.0.1:51020
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.765796 [DEBUG] http: Request GET /v1/agent/services (1.196698ms) from=127.0.0.1:51018
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:05.772124 [DEBUG] http: Request GET /v1/agent/checks (1.087362ms) from=127.0.0.1:51018
2019/12/30 19:04:05 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:05 [INFO]  raft: Node at 127.0.0.1:46024 [Leader] entering Leader state
TestRegisterMonitor_good - 2019/12/30 19:04:05.913991 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:05.914205 [INFO] consul: cluster leadership acquired
TestRegisterMonitor_good - 2019/12/30 19:04:05.914738 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:05.914776 [INFO] consul: New leader elected: Node b94fcd30-cad0-846f-6b83-76106d53b9b9
TestRegisterMonitor_good - 2019/12/30 19:04:05.914791 [INFO] agent: shutdown complete
TestRegisterMonitor_good - 2019/12/30 19:04:05.914862 [INFO] agent: Stopping DNS server 127.0.0.1:46007 (tcp)
TestRegisterMonitor_good - 2019/12/30 19:04:05.914999 [INFO] agent: Stopping DNS server 127.0.0.1:46007 (udp)
TestRegisterMonitor_good - 2019/12/30 19:04:05.915033 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRegisterMonitor_good - 2019/12/30 19:04:05.915143 [INFO] agent: Stopping HTTP server 127.0.0.1:46008 (tcp)
TestRegisterMonitor_good - 2019/12/30 19:04:05.915378 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestRegisterMonitor_good - 2019/12/30 19:04:05.915661 [ERR] consul: failed to reconcile member: {Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6 127.0.0.1 46010 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:ca07c82a-0c86-4c10-81fa-7c1cb7b896f6 port:46012 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:46011] alive 1 5 2 2 5 4}: raft is already shutdown
TestRegisterMonitor_good - 2019/12/30 19:04:05.915726 [INFO] agent: Waiting for endpoints to shut down
TestRegisterMonitor_good - 2019/12/30 19:04:05.915927 [INFO] agent: Endpoints down
--- PASS: TestRegisterMonitor_good (13.29s)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:06.030088 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:06.030170 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:06.030186 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:06.178967 [INFO] agent: Requesting shutdown
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:06.240972 [DEBUG] agent: Check "foo-proxy-ttl" status is now critical
TestRegisterMonitor_good - 2019/12/30 19:04:06.496749 [WARN] consul: error getting server health from "Node ca07c82a-0c86-4c10-81fa-7c1cb7b896f6": context deadline exceeded
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:06.608087 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.022092 [INFO] agent: Synced service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.022326 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.022468 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.022650 [DEBUG] http: Request PUT /v1/agent/service/register (1.272397148s) from=127.0.0.1:51020
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.022940 [DEBUG] agent: Service "foo-proxy" in sync
2019/12/30 19:04:07 [INFO] proxy: registered Consul service: foo-proxy
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.093597 [DEBUG] agent: Check "foo-proxy-ttl" status is now passing
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:07.201513 [INFO] agent: Synced service "no-sidecar"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.441859 [INFO] agent: Synced check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.441925 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442021 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442072 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442102 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442173 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442220 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442248 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442317 [DEBUG] http: Request PUT /v1/agent/check/fail/foo-proxy-ttl?note= (1.663213201s) from=127.0.0.1:51018
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.442720 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.805214 [INFO] agent: Synced check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.805854 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.809820 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.810005 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.810173 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.810340 [DEBUG] http: Request PUT /v1/agent/check/pass/foo-proxy-ttl?note= (716.799715ms) from=127.0.0.1:51020
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:07.810755 [DEBUG] http: Request GET /v1/agent/checks (366.621076ms) from=127.0.0.1:51018
2019/12/30 19:04:07 [INFO] proxy: stop request received, deregistering
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:08.146223 [INFO] agent: Synced service "one-sidecar"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.331517 [INFO] agent: Deregistered service "foo-proxy"
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:08.807886 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.978753 [INFO] agent: Deregistered check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.979023 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.979685 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.981830 [DEBUG] agent: removed check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.982038 [DEBUG] agent: removed service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.982173 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.982329 [DEBUG] http: Request PUT /v1/agent/service/deregister/foo-proxy (1.167604362s) from=127.0.0.1:51020
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.983110 [INFO] agent: Requesting shutdown
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.983183 [INFO] consul: shutting down server
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:08.983236 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.178349 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.400810 [INFO] manager: shutting down
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.401455 [INFO] agent: consul server down
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.401521 [INFO] agent: shutdown complete
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.401580 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (tcp)
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.401725 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (udp)
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.401899 [INFO] agent: Stopping HTTP server 127.0.0.1:46002 (tcp)
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.402642 [INFO] agent: Waiting for endpoints to shut down
TestRegisterMonitor_heartbeat - 2019/12/30 19:04:09.402746 [INFO] agent: Endpoints down
--- PASS: TestRegisterMonitor_heartbeat (16.78s)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:09.713238 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.647375 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653125 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653222 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653275 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653318 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653359 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653391 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653645 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653701 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653743 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653823 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653908 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.653954 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.654034 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.654120 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.654174 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.654260 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.654333 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.654716 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.654771 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704206 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704284 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704321 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704355 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704428 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704466 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704514 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704557 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704596 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704635 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.704665 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:10.836508 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.028360 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.028481 [ERR] connect: Apply failed leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.028530 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.028738 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.031325 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.031422 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.031489 [INFO] agent: Stopping DNS server 127.0.0.1:46019 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.031683 [INFO] agent: Stopping DNS server 127.0.0.1:46019 (udp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.031872 [INFO] agent: Stopping HTTP server 127.0.0.1:46020 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.032118 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/30 19:04:11.032206 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service_flag_with_-service-addr
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:11.090308 [WARN] agent: Node name "Node 9cbaaf32-8b44-33c8-384c-3e4292d4be99" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:11.090701 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:11.092909 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9cbaaf32-8b44-33c8-384c-3e4292d4be99 Address:127.0.0.1:46030}]
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:46030 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.760004 [INFO] serf: EventMemberJoin: Node 9cbaaf32-8b44-33c8-384c-3e4292d4be99.dc1 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.768622 [INFO] serf: EventMemberJoin: Node 9cbaaf32-8b44-33c8-384c-3e4292d4be99 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.770271 [INFO] consul: Adding LAN server Node 9cbaaf32-8b44-33c8-384c-3e4292d4be99 (Addr: tcp/127.0.0.1:46030) (DC: dc1)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.770806 [INFO] consul: Handled member-join event for server "Node 9cbaaf32-8b44-33c8-384c-3e4292d4be99.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.778656 [INFO] agent: Started DNS server 127.0.0.1:46025 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.778858 [INFO] agent: Started DNS server 127.0.0.1:46025 (udp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.781588 [INFO] agent: Started HTTP server on 127.0.0.1:46026 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:12.781708 [INFO] agent: started state syncer
2019/12/30 19:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:46030 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:13 [INFO]  raft: Node at 127.0.0.1:46030 [Leader] entering Leader state
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:13.395427 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:13.395878 [INFO] consul: New leader elected: Node 9cbaaf32-8b44-33c8-384c-3e4292d4be99
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:13.533733 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:13.533799 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:13.533733 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:13.697407 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:14.048139 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:14.381797 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:14.758586 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:15.432989 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:15.845984 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.244728 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.244848 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.244898 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.244957 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.245010 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.245045 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.245292 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.245355 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.411705 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.529764 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.658090 [ERR] connect: Apply failed leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.658203 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.658490 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.658582 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.658657 [INFO] agent: Stopping DNS server 127.0.0.1:46025 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.658883 [INFO] agent: Stopping DNS server 127.0.0.1:46025 (udp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.659100 [INFO] agent: Stopping HTTP server 127.0.0.1:46026 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.659350 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/30 19:04:16.659495 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service,_-service-addr,_-listen
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:16.735603 [WARN] agent: Node name "Node a9abfe89-2b71-eaae-c279-14955921670a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:16.736003 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:16.738137 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a9abfe89-2b71-eaae-c279-14955921670a Address:127.0.0.1:46036}]
2019/12/30 19:04:18 [INFO]  raft: Node at 127.0.0.1:46036 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.266183 [INFO] serf: EventMemberJoin: Node a9abfe89-2b71-eaae-c279-14955921670a.dc1 127.0.0.1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.276662 [INFO] serf: EventMemberJoin: Node a9abfe89-2b71-eaae-c279-14955921670a 127.0.0.1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.289300 [INFO] consul: Adding LAN server Node a9abfe89-2b71-eaae-c279-14955921670a (Addr: tcp/127.0.0.1:46036) (DC: dc1)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.299821 [INFO] consul: Handled member-join event for server "Node a9abfe89-2b71-eaae-c279-14955921670a.dc1" in area "wan"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.305343 [INFO] agent: Started DNS server 127.0.0.1:46031 (udp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.305443 [INFO] agent: Started DNS server 127.0.0.1:46031 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.310095 [INFO] agent: Started HTTP server on 127.0.0.1:46032 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:18.310215 [INFO] agent: started state syncer
2019/12/30 19:04:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:18 [INFO]  raft: Node at 127.0.0.1:46036 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:19 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:46036 [Leader] entering Leader state
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:19.047542 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:19.048090 [INFO] consul: New leader elected: Node a9abfe89-2b71-eaae-c279-14955921670a
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:19.180655 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:19.180775 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:19.180892 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:19.224288 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:19.691883 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:20.538595 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:20.825440 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:21.387291 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.071081 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.073877 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410570 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410662 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410706 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410748 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410786 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410814 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410913 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410956 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.410995 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411032 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411066 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411099 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411140 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411179 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411217 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411254 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411280 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411490 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.411542 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.495331 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.671148 [INFO] manager: shutting down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.672401 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.672618 [INFO] agent: consul server down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.672670 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.672725 [INFO] agent: Stopping DNS server 127.0.0.1:46031 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.672828 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.672885 [INFO] agent: Stopping DNS server 127.0.0.1:46031 (udp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.673040 [INFO] agent: Stopping HTTP server 127.0.0.1:46032 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.673072 [ERR] consul: failed to reconcile member: {Node a9abfe89-2b71-eaae-c279-14955921670a 127.0.0.1 46034 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:a9abfe89-2b71-eaae-c279-14955921670a port:46036 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:46035] alive 1 5 2 2 5 4}: raft is already shutdown
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.673266 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/30 19:04:22.673347 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_no_sidecar
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:22.763824 [WARN] agent: Node name "Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:22.764267 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:22.766720 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7ca57ee6-402c-2a58-7bf2-0b0683088e18 Address:127.0.0.1:46042}]
2019/12/30 19:04:24 [INFO]  raft: Node at 127.0.0.1:46042 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.130532 [INFO] serf: EventMemberJoin: Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.135170 [INFO] serf: EventMemberJoin: Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.136306 [INFO] consul: Adding LAN server Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18 (Addr: tcp/127.0.0.1:46042) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.136495 [INFO] consul: Handled member-join event for server "Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.140808 [INFO] agent: Started DNS server 127.0.0.1:46037 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.141204 [INFO] agent: Started DNS server 127.0.0.1:46037 (udp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.146570 [INFO] agent: Started HTTP server on 127.0.0.1:46038 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.146689 [INFO] agent: started state syncer
2019/12/30 19:04:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:24 [INFO]  raft: Node at 127.0.0.1:46042 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:24 [INFO]  raft: Node at 127.0.0.1:46042 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.849067 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:24.849596 [INFO] consul: New leader elected: Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:25.008333 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:25.009570 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:25.009689 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:25.331788 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:25.996640 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:26.696954 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.013827 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.265480 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.344584 [WARN] agent: Check "service:two-sidecars-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:21000: connect: connection refused
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.431815 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.598735 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.689781 [WARN] agent: Check "service:one-sidecar-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:9999: connect: connection refused
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.770549 [DEBUG] consul: Skipping self join check for "Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18" since the cluster is too small
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.770885 [INFO] consul: member 'Node 7ca57ee6-402c-2a58-7bf2-0b0683088e18' joined, marking health alive
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.771871 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.771963 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772023 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772077 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772130 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772169 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772318 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772369 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772407 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772444 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772476 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772510 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772552 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772594 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772642 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772713 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.772762 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.775540 [DEBUG] http: Request GET /v1/agent/services (3.753513697s) from=127.0.0.1:35132
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.778618 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.778852 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.778916 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.938700 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990065 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990154 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990201 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990239 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990277 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990313 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990372 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990426 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990466 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990509 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:28.990539 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.048910 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.049693 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.049762 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.049823 [INFO] agent: Stopping DNS server 127.0.0.1:46037 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.049983 [INFO] agent: Stopping DNS server 127.0.0.1:46037 (udp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.050149 [INFO] agent: Stopping HTTP server 127.0.0.1:46038 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.050577 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/30 19:04:29.050663 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:29.158008 [WARN] agent: Node name "Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:29.158417 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:29.160675 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4bf1d373-d665-c1fe-85b1-a1013c73c9eb Address:127.0.0.1:46048}]
2019/12/30 19:04:30 [INFO]  raft: Node at 127.0.0.1:46048 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.217592 [INFO] serf: EventMemberJoin: Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.221436 [INFO] serf: EventMemberJoin: Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.222379 [INFO] consul: Adding LAN server Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb (Addr: tcp/127.0.0.1:46048) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.222630 [INFO] consul: Handled member-join event for server "Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.224363 [INFO] agent: Started DNS server 127.0.0.1:46043 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.224520 [INFO] agent: Started DNS server 127.0.0.1:46043 (udp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.241345 [INFO] agent: Started HTTP server on 127.0.0.1:46044 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.241464 [INFO] agent: started state syncer
2019/12/30 19:04:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:30 [INFO]  raft: Node at 127.0.0.1:46048 [Candidate] entering Candidate state in term 2
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:30.438377 [WARN] agent: Check "service:two-sidecars-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:21000: connect: connection refused
2019/12/30 19:04:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:31 [INFO]  raft: Node at 127.0.0.1:46048 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:31.438638 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:31.439144 [INFO] consul: New leader elected: Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:31.613269 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:31.613543 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:31.613673 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:31.981699 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:32.438379 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:32.838360 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.138549 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.564920 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880547 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880655 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880702 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880742 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880784 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880816 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880931 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.880973 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881013 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881048 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881083 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881119 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881159 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881199 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881237 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881276 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.881304 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.882333 [DEBUG] http: Request GET /v1/agent/services (2.062319436s) from=127.0.0.1:48080
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.883743 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.884213 [DEBUG] consul: Skipping self join check for "Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb" since the cluster is too small
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.885274 [INFO] consul: member 'Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb' joined, marking health alive
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.885413 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.885615 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:33.885667 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.020461 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.145514 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.229129 [ERR] consul: failed to reconcile member: {Node 4bf1d373-d665-c1fe-85b1-a1013c73c9eb 127.0.0.1 46046 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:4bf1d373-d665-c1fe-85b1-a1013c73c9eb port:46048 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:46047] alive 1 5 2 2 5 4}: leadership lost while committing log
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.229258 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.229311 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.229376 [INFO] agent: Stopping DNS server 127.0.0.1:46043 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.229639 [INFO] agent: Stopping DNS server 127.0.0.1:46043 (udp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.229819 [INFO] agent: Stopping HTTP server 127.0.0.1:46044 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.230311 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/30 19:04:34.230392 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_non-existent
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:34.294007 [WARN] agent: Node name "Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:34.294616 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:34.298164 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7f7ebe7b-d5f0-e829-99b1-c7913531045b Address:127.0.0.1:46054}]
2019/12/30 19:04:35 [INFO]  raft: Node at 127.0.0.1:46054 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.416233 [INFO] serf: EventMemberJoin: Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.420065 [INFO] serf: EventMemberJoin: Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.420972 [INFO] consul: Adding LAN server Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b (Addr: tcp/127.0.0.1:46054) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.421238 [INFO] consul: Handled member-join event for server "Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.422383 [INFO] agent: Started DNS server 127.0.0.1:46049 (udp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.422747 [INFO] agent: Started DNS server 127.0.0.1:46049 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.425215 [INFO] agent: Started HTTP server on 127.0.0.1:46050 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:35.425325 [INFO] agent: started state syncer
2019/12/30 19:04:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:35 [INFO]  raft: Node at 127.0.0.1:46054 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:36 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:36 [INFO]  raft: Node at 127.0.0.1:46054 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:36.037626 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:36.038095 [INFO] consul: New leader elected: Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:36.109922 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:36.109922 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:36.109939 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:36.408373 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:36.716039 [WARN] agent: Check "service:two-sidecars-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:21000: connect: connection refused
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:37.038530 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:37.490006 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:37.646733 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:37.988125 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:37.988596 [DEBUG] consul: Skipping self join check for "Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b" since the cluster is too small
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:37.988769 [INFO] consul: member 'Node 7f7ebe7b-d5f0-e829-99b1-c7913531045b' joined, marking health alive
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:37.988856 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.340658 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.340774 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.340825 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.340875 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.340926 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.340959 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341138 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341224 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341272 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341319 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341356 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341393 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341441 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341492 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341532 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341577 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.341608 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.342661 [DEBUG] http: Request GET /v1/agent/services (2.071830349s) from=127.0.0.1:45074
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.345703 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.345926 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.345989 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.420531 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.478939 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.479440 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.479500 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.479567 [INFO] agent: Stopping DNS server 127.0.0.1:46049 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.479732 [INFO] agent: Stopping DNS server 127.0.0.1:46049 (udp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.479907 [INFO] agent: Stopping HTTP server 127.0.0.1:46050 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.480417 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/30 19:04:38.480516 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_one_sidecar
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:38.544367 [WARN] agent: Node name "Node 87dfb271-9d0b-dbe5-7d6a-7e64b942b041" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:38.545102 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:38.551164 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:87dfb271-9d0b-dbe5-7d6a-7e64b942b041 Address:127.0.0.1:46060}]
2019/12/30 19:04:39 [INFO]  raft: Node at 127.0.0.1:46060 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.299793 [INFO] serf: EventMemberJoin: Node 87dfb271-9d0b-dbe5-7d6a-7e64b942b041.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.303205 [INFO] serf: EventMemberJoin: Node 87dfb271-9d0b-dbe5-7d6a-7e64b942b041 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.304029 [INFO] consul: Handled member-join event for server "Node 87dfb271-9d0b-dbe5-7d6a-7e64b942b041.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.304315 [INFO] consul: Adding LAN server Node 87dfb271-9d0b-dbe5-7d6a-7e64b942b041 (Addr: tcp/127.0.0.1:46060) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.315779 [INFO] agent: Started DNS server 127.0.0.1:46055 (udp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.316198 [INFO] agent: Started DNS server 127.0.0.1:46055 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.318447 [INFO] agent: Started HTTP server on 127.0.0.1:46056 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.318540 [INFO] agent: started state syncer
2019/12/30 19:04:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:39 [INFO]  raft: Node at 127.0.0.1:46060 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:39 [INFO]  raft: Node at 127.0.0.1:46060 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.764672 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.765181 [INFO] consul: New leader elected: Node 87dfb271-9d0b-dbe5-7d6a-7e64b942b041
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.856136 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.856324 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.856883 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.880980 [DEBUG] http: Request GET /v1/agent/services (2.181725ms) from=127.0.0.1:57736
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.889261 [DEBUG] http: Request GET /v1/agent/service/one-sidecar-sidecar-proxy (2.828075ms) from=127.0.0.1:57736
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.890653 [DEBUG] http: Request GET /v1/agent/service/one-sidecar-sidecar-proxy (4.328448ms) from=127.0.0.1:57738
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.893211 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.893656 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.893728 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:39.894295 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.025576 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.249135 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.354237 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.354314 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.354373 [INFO] agent: Stopping DNS server 127.0.0.1:46055 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.354590 [INFO] agent: Stopping DNS server 127.0.0.1:46055 (udp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.354744 [INFO] agent: Stopping HTTP server 127.0.0.1:46056 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.355209 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:40.355558 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:41.355162 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:46056 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:41.355262 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/30 19:04:41.355301 [INFO] agent: Endpoints down
--- PASS: TestCommandConfigWatcher (48.73s)
    --- PASS: TestCommandConfigWatcher/-service_flag_only (9.82s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_upstreams (8.59s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_-service-addr (5.63s)
    --- PASS: TestCommandConfigWatcher/-service,_-service-addr,_-listen (6.01s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_no_sidecar (6.38s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars (5.18s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_non-existent (4.25s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_one_sidecar (2.87s)
PASS
ok  	github.com/hashicorp/consul/command/connect/proxy	51.265s
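The per-test timings in the summary above follow `go test -v`'s fixed `--- PASS: <name> (<seconds>s)` format, with subtests indented under their parent, so they can be tallied mechanically when auditing a long build log like this one. A minimal sketch in Python (the sample lines are copied verbatim from the summary above; the regex and variable names are this sketch's own, not part of any tool):

```python
import re

# Sample `go test -v` result lines, copied from the build log above.
log = """\
--- PASS: TestCommandConfigWatcher (48.73s)
    --- PASS: TestCommandConfigWatcher/-service_flag_only (9.82s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_upstreams (8.59s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_-service-addr (5.63s)
    --- PASS: TestCommandConfigWatcher/-service,_-service-addr,_-listen (6.01s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_no_sidecar (6.38s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars (5.18s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_non-existent (4.25s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_one_sidecar (2.87s)
"""

# Match result lines of the form "--- PASS: <name> (<seconds>s)".
pattern = re.compile(r"--- (PASS|FAIL|SKIP): (\S+) \(([\d.]+)s\)")

results = [(m.group(2), float(m.group(3))) for m in pattern.finditer(log)]

# Subtest names contain '/'; their times should roughly sum to the parent's.
subtest_total = sum(t for name, t in results if "/" in name)
print(f"parsed {len(results)} results; subtest total {subtest_total:.2f}s")
```

For this excerpt the eight subtest durations sum to the parent's reported 48.73s, which is a quick sanity check that no subtest line was lost when the log was captured.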
=== RUN   TestDebugCommand_noTabs
=== PAUSE TestDebugCommand_noTabs
=== RUN   TestDebugCommand
--- SKIP: TestDebugCommand (0.00s)
    debug_test.go:29: DM-skipped
=== RUN   TestDebugCommand_Archive
=== PAUSE TestDebugCommand_Archive
=== RUN   TestDebugCommand_ArgsBad
=== PAUSE TestDebugCommand_ArgsBad
=== RUN   TestDebugCommand_OutputPathBad
=== PAUSE TestDebugCommand_OutputPathBad
=== RUN   TestDebugCommand_OutputPathExists
=== PAUSE TestDebugCommand_OutputPathExists
=== RUN   TestDebugCommand_CaptureTargets
=== PAUSE TestDebugCommand_CaptureTargets
=== RUN   TestDebugCommand_ProfilesExist
=== PAUSE TestDebugCommand_ProfilesExist
=== RUN   TestDebugCommand_ValidateTiming
=== PAUSE TestDebugCommand_ValidateTiming
=== RUN   TestDebugCommand_DebugDisabled
=== PAUSE TestDebugCommand_DebugDisabled
=== CONT  TestDebugCommand_noTabs
=== CONT  TestDebugCommand_CaptureTargets
=== CONT  TestDebugCommand_DebugDisabled
=== CONT  TestDebugCommand_OutputPathBad
--- PASS: TestDebugCommand_noTabs (0.00s)
=== CONT  TestDebugCommand_ArgsBad
--- PASS: TestDebugCommand_ArgsBad (0.08s)
=== CONT  TestDebugCommand_OutputPathExists
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:18.185419 [WARN] agent: Node name "Node 65e8ca2e-a7ad-ba65-293f-ce804571694f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:18.214322 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:18.251170 [WARN] agent: Node name "Node 5bb7fe1b-999d-b249-dbff-5f3f92d1c121" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:18.251677 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:18.261516 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:18.269036 [WARN] agent: Node name "Node a5d1c16b-fd29-6296-bc1b-6765763b8d70" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:18.269630 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:18.272903 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:18.306787 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:18.321826 [WARN] agent: Node name "Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:18.322534 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:18.325091 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:65e8ca2e-a7ad-ba65-293f-ce804571694f Address:127.0.0.1:40006}]
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
2019/12/30 19:04:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a5d1c16b-fd29-6296-bc1b-6765763b8d70 Address:127.0.0.1:40018}]
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40018 [Follower] entering Follower state (Leader: "")
2019/12/30 19:04:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5bb7fe1b-999d-b249-dbff-5f3f92d1c121 Address:127.0.0.1:40024}]
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40024 [Follower] entering Follower state (Leader: "")
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.401525 [INFO] serf: EventMemberJoin: Node a5d1c16b-fd29-6296-bc1b-6765763b8d70.dc1 127.0.0.1
2019/12/30 19:04:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6d9c3c07-ad6f-8fc4-4676-432f192015bd Address:127.0.0.1:40012}]
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.405018 [INFO] serf: EventMemberJoin: Node 65e8ca2e-a7ad-ba65-293f-ce804571694f.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.405018 [INFO] serf: EventMemberJoin: Node a5d1c16b-fd29-6296-bc1b-6765763b8d70 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.406695 [INFO] serf: EventMemberJoin: Node 5bb7fe1b-999d-b249-dbff-5f3f92d1c121.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.407832 [INFO] agent: Started DNS server 127.0.0.1:40013 (udp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.408934 [INFO] consul: Adding LAN server Node a5d1c16b-fd29-6296-bc1b-6765763b8d70 (Addr: tcp/127.0.0.1:40018) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.409174 [INFO] consul: Handled member-join event for server "Node a5d1c16b-fd29-6296-bc1b-6765763b8d70.dc1" in area "wan"
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40012 [Follower] entering Follower state (Leader: "")
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.413273 [INFO] serf: EventMemberJoin: Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd.dc1 127.0.0.1
2019/12/30 19:04:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40018 [Candidate] entering Candidate state in term 2
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.438121 [INFO] serf: EventMemberJoin: Node 5bb7fe1b-999d-b249-dbff-5f3f92d1c121 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.439327 [INFO] agent: Started DNS server 127.0.0.1:40013 (tcp)
2019/12/30 19:04:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.443026 [INFO] serf: EventMemberJoin: Node 65e8ca2e-a7ad-ba65-293f-ce804571694f 127.0.0.1
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.444532 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.446432 [INFO] agent: Started HTTP server on 127.0.0.1:40014 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:19.446726 [INFO] agent: started state syncer
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.448450 [INFO] serf: EventMemberJoin: Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd 127.0.0.1
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.450152 [INFO] consul: Adding LAN server Node 65e8ca2e-a7ad-ba65-293f-ce804571694f (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.450456 [INFO] consul: Adding LAN server Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd (Addr: tcp/127.0.0.1:40012) (DC: dc1)
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.450519 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.451083 [INFO] consul: Handled member-join event for server "Node 65e8ca2e-a7ad-ba65-293f-ce804571694f.dc1" in area "wan"
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.451759 [INFO] agent: Started DNS server 127.0.0.1:40007 (udp)
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.452061 [INFO] consul: Handled member-join event for server "Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd.dc1" in area "wan"
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.452376 [INFO] consul: Adding LAN server Node 5bb7fe1b-999d-b249-dbff-5f3f92d1c121 (Addr: tcp/127.0.0.1:40024) (DC: dc1)
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.452550 [INFO] agent: Started DNS server 127.0.0.1:40007 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.452571 [INFO] consul: Handled member-join event for server "Node 5bb7fe1b-999d-b249-dbff-5f3f92d1c121.dc1" in area "wan"
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.453451 [INFO] agent: Started DNS server 127.0.0.1:40019 (udp)
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.453513 [INFO] agent: Started DNS server 127.0.0.1:40019 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.453683 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
2019/12/30 19:04:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:19.453803 [INFO] agent: started state syncer
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40012 [Candidate] entering Candidate state in term 2
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.455982 [INFO] agent: Started HTTP server on 127.0.0.1:40020 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:19.456062 [INFO] agent: started state syncer
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.456859 [INFO] agent: Started HTTP server on 127.0.0.1:40008 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:19.456937 [INFO] agent: started state syncer
2019/12/30 19:04:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:19 [INFO]  raft: Node at 127.0.0.1:40024 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:20 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
2019/12/30 19:04:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:20 [INFO]  raft: Node at 127.0.0.1:40024 [Leader] entering Leader state
2019/12/30 19:04:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:20 [INFO]  raft: Node at 127.0.0.1:40018 [Leader] entering Leader state
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:20.180971 [INFO] consul: cluster leadership acquired
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:20.181544 [INFO] consul: New leader elected: Node 5bb7fe1b-999d-b249-dbff-5f3f92d1c121
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:20.182052 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:20.182478 [INFO] consul: New leader elected: Node a5d1c16b-fd29-6296-bc1b-6765763b8d70
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:20.182912 [INFO] consul: cluster leadership acquired
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:20.183394 [INFO] consul: New leader elected: Node 65e8ca2e-a7ad-ba65-293f-ce804571694f
2019/12/30 19:04:20 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:20 [INFO]  raft: Node at 127.0.0.1:40012 [Leader] entering Leader state
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:20.184316 [INFO] consul: cluster leadership acquired
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:20.184920 [INFO] consul: New leader elected: Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:20.722855 [INFO] agent: Synced node info
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:20.821173 [INFO] agent: Synced node info
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:20.821320 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:20.821208 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:20.827371 [INFO] agent: Synced node info
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:20.859738 [DEBUG] agent: Node info in sync
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:20.923060 [DEBUG] http: Request GET /v1/agent/self (192.753121ms) from=127.0.0.1:42552
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:20.940983 [INFO] agent: Requesting shutdown
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:20.941219 [INFO] consul: shutting down server
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:20.941357 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.053639 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.059271 [DEBUG] http: Request GET /v1/agent/self (227.097366ms) from=127.0.0.1:47194
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.073182 [INFO] agent: Requesting shutdown
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.073298 [INFO] consul: shutting down server
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.073351 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:21.076851 [DEBUG] http: Request GET /v1/agent/self (243.869145ms) from=127.0.0.1:46850
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:21.077791 [DEBUG] http: Request GET /v1/agent/self (244.994841ms) from=127.0.0.1:32798
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.162673 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.162765 [INFO] manager: shutting down
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.163635 [INFO] agent: consul server down
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.163713 [INFO] agent: shutdown complete
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.163772 [INFO] agent: Stopping DNS server 127.0.0.1:40019 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.163919 [INFO] agent: Stopping DNS server 127.0.0.1:40019 (udp)
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.164079 [INFO] agent: Stopping HTTP server 127.0.0.1:40020 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.164621 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.164740 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.164917 [INFO] agent: Endpoints down
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.166098 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDebugCommand_OutputPathExists - 2019/12/30 19:04:21.166184 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
--- PASS: TestDebugCommand_OutputPathExists (3.08s)
=== CONT  TestDebugCommand_ValidateTiming
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:21.256317 [WARN] agent: Node name "Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:21.267847 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:21.271345 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.385718 [INFO] manager: shutting down
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.820909 [INFO] agent: consul server down
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.821014 [INFO] agent: shutdown complete
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.821079 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.821256 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.821437 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.821963 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.822093 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDebugCommand_OutputPathBad - 2019/12/30 19:04:21.822304 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_OutputPathBad (3.81s)
=== CONT  TestDebugCommand_ProfilesExist
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:21.898054 [WARN] agent: Node name "Node 8e846288-4dc6-e189-acf1-cb9c144292a3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:21.898690 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:21.903350 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.151368 [DEBUG] http: Request GET /v1/agent/host (1.059068134s) from=127.0.0.1:46850
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.156618 [DEBUG] http: Request GET /v1/agent/host (1.046175458s) from=127.0.0.1:32798
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.180305 [DEBUG] http: Request GET /v1/agent/self (17.118455ms) from=127.0.0.1:46850
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.192807 [DEBUG] http: Request GET /v1/agent/members?wan=1 (980.36µs) from=127.0.0.1:46850
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.286420 [WARN] agent: Node name "Node 992bfe13-5092-3523-b128-518e1fe76327" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.286845 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.289125 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.496199 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.496724 [DEBUG] consul: Skipping self join check for "Node a5d1c16b-fd29-6296-bc1b-6765763b8d70" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:22.496912 [INFO] consul: member 'Node a5d1c16b-fd29-6296-bc1b-6765763b8d70' joined, marking health alive
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.499892 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.500393 [DEBUG] consul: Skipping self join check for "Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd" since the cluster is too small
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.500562 [INFO] consul: member 'Node 6d9c3c07-ad6f-8fc4-4676-432f192015bd' joined, marking health alive
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.505904 [DEBUG] http: Request GET /v1/agent/self (332.972512ms) from=127.0.0.1:32798
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.520010 [DEBUG] http: Request GET /v1/agent/members?wan=1 (1.097029ms) from=127.0.0.1:32798
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.538171 [DEBUG] http: Request GET /v1/agent/metrics (3.895103ms) from=127.0.0.1:32808
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.650634 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:22.650735 [DEBUG] agent: Node info in sync
2019/12/30 19:04:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:661d6e0b-d8b7-d2b4-e780-06f63c3a470b Address:127.0.0.1:40030}]
2019/12/30 19:04:22 [INFO]  raft: Node at 127.0.0.1:40030 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.676473 [INFO] serf: EventMemberJoin: Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.684041 [INFO] serf: EventMemberJoin: Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.685441 [INFO] consul: Handled member-join event for server "Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.685602 [INFO] agent: Started DNS server 127.0.0.1:40025 (udp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.685895 [INFO] agent: Started DNS server 127.0.0.1:40025 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.685900 [INFO] consul: Adding LAN server Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b (Addr: tcp/127.0.0.1:40030) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.688301 [INFO] agent: Started HTTP server on 127.0.0.1:40026 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:22.688405 [INFO] agent: started state syncer
2019/12/30 19:04:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:22 [INFO]  raft: Node at 127.0.0.1:40030 [Candidate] entering Candidate state in term 2
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.064776 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:23.066726 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8e846288-4dc6-e189-acf1-cb9c144292a3 Address:127.0.0.1:40036}]
2019/12/30 19:04:23 [INFO]  raft: Node at 127.0.0.1:40036 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.234667 [INFO] serf: EventMemberJoin: Node 8e846288-4dc6-e189-acf1-cb9c144292a3.dc1 127.0.0.1
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.240741 [INFO] serf: EventMemberJoin: Node 8e846288-4dc6-e189-acf1-cb9c144292a3 127.0.0.1
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.242683 [INFO] consul: Handled member-join event for server "Node 8e846288-4dc6-e189-acf1-cb9c144292a3.dc1" in area "wan"
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.243268 [INFO] agent: Started DNS server 127.0.0.1:40031 (tcp)
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.243350 [INFO] agent: Started DNS server 127.0.0.1:40031 (udp)
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.244106 [INFO] consul: Adding LAN server Node 8e846288-4dc6-e189-acf1-cb9c144292a3 (Addr: tcp/127.0.0.1:40036) (DC: dc1)
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.246059 [INFO] agent: Started HTTP server on 127.0.0.1:40032 (tcp)
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:23.246260 [INFO] agent: started state syncer
2019/12/30 19:04:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:23 [INFO]  raft: Node at 127.0.0.1:40036 [Candidate] entering Candidate state in term 2
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.358746 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.358817 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.358884 [DEBUG] agent: Node info in sync
2019/12/30 19:04:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:992bfe13-5092-3523-b128-518e1fe76327 Address:127.0.0.1:40042}]
2019/12/30 19:04:23 [INFO]  raft: Node at 127.0.0.1:40042 [Follower] entering Follower state (Leader: "")
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.509547 [INFO] serf: EventMemberJoin: Node 992bfe13-5092-3523-b128-518e1fe76327.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.513290 [INFO] serf: EventMemberJoin: Node 992bfe13-5092-3523-b128-518e1fe76327 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.514203 [INFO] consul: Adding LAN server Node 992bfe13-5092-3523-b128-518e1fe76327 (Addr: tcp/127.0.0.1:40042) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.514929 [INFO] consul: Handled member-join event for server "Node 992bfe13-5092-3523-b128-518e1fe76327.dc1" in area "wan"
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.516334 [INFO] agent: Started DNS server 127.0.0.1:40037 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.516541 [INFO] agent: Started DNS server 127.0.0.1:40037 (udp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.519573 [INFO] agent: Started HTTP server on 127.0.0.1:40038 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:23.519685 [INFO] agent: started state syncer
2019/12/30 19:04:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:23 [INFO]  raft: Node at 127.0.0.1:40042 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:23 [INFO]  raft: Node at 127.0.0.1:40030 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:23.603831 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:23.604478 [INFO] consul: New leader elected: Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:24.122022 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:24.195844 [WARN] agent: Node name "Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:24.196215 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:24.198453 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:24 [INFO]  raft: Node at 127.0.0.1:40036 [Leader] entering Leader state
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:24.231578 [INFO] consul: cluster leadership acquired
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:24.232131 [INFO] consul: New leader elected: Node 8e846288-4dc6-e189-acf1-cb9c144292a3
2019/12/30 19:04:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:24 [INFO]  raft: Node at 127.0.0.1:40042 [Leader] entering Leader state
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:24.412996 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:24.413440 [INFO] consul: New leader elected: Node 992bfe13-5092-3523-b128-518e1fe76327
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:24.714164 [INFO] agent: Synced node info
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:24.714271 [DEBUG] agent: Node info in sync
/tmp/consul-test/TestDebugCommand_ProfilesExist-debug219725728/debug
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:24.941176 [DEBUG] http: Request GET /v1/agent/self (212.021965ms) from=127.0.0.1:35776
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.332915 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.333086 [DEBUG] agent: Node info in sync
2019/12/30 19:04:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0579eaa5-17ce-8bb2-ef4c-1226fbac8322 Address:127.0.0.1:40048}]
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.421302 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.422187 [DEBUG] consul: Skipping self join check for "Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b" since the cluster is too small
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.422448 [INFO] consul: member 'Node 661d6e0b-d8b7-d2b4-e780-06f63c3a470b' joined, marking health alive
2019/12/30 19:04:25 [INFO]  raft: Node at 127.0.0.1:40048 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.429149 [INFO] serf: EventMemberJoin: Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.434236 [INFO] serf: EventMemberJoin: Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.436070 [INFO] consul: Handled member-join event for server "Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.436976 [INFO] agent: Started DNS server 127.0.0.1:40043 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.437089 [INFO] agent: Started DNS server 127.0.0.1:40043 (udp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.439877 [INFO] agent: Started HTTP server on 127.0.0.1:40044 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.440099 [INFO] agent: started state syncer
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:25.443344 [INFO] consul: Adding LAN server Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322 (Addr: tcp/127.0.0.1:40048) (DC: dc1)
2019/12/30 19:04:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:25 [INFO]  raft: Node at 127.0.0.1:40048 [Candidate] entering Candidate state in term 2
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.531055 [DEBUG] http: Request GET /v1/agent/self (171.807896ms) from=127.0.0.1:52014
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.538963 [INFO] agent: Requesting shutdown
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.563563 [INFO] consul: shutting down server
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.563744 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.578318 [DEBUG] http: Request GET /v1/agent/metrics (784.688µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.634984 [DEBUG] http: Request GET /v1/agent/metrics (913.358µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.691740 [DEBUG] http: Request GET /v1/agent/metrics (1.168364ms) from=127.0.0.1:52014
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.735097 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.747814 [DEBUG] http: Request GET /v1/agent/metrics (1.2667ms) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.802852 [DEBUG] http: Request GET /v1/agent/metrics (647.351µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.859823 [DEBUG] http: Request GET /v1/agent/metrics (2.443398ms) from=127.0.0.1:52014
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.904908 [INFO] manager: shutting down
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.906216 [INFO] agent: consul server down
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.906310 [INFO] agent: shutdown complete
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.906452 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.906777 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (udp)
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:25.907110 [INFO] agent: Stopping HTTP server 127.0.0.1:40008 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.915390 [DEBUG] http: Request GET /v1/agent/metrics (697.019µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:25.970964 [DEBUG] http: Request GET /v1/agent/metrics (761.687µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.026577 [DEBUG] http: Request GET /v1/agent/metrics (564.015µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.080776 [DEBUG] http: Request GET /v1/agent/metrics (536.014µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.134548 [DEBUG] http: Request GET /v1/agent/metrics (488.346µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.191696 [DEBUG] http: Request GET /v1/agent/metrics (734.686µs) from=127.0.0.1:52014
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.201428 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.201511 [DEBUG] agent: Node info in sync
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.201590 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.261621 [DEBUG] http: Request GET /v1/agent/metrics (643.017µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.323249 [DEBUG] http: Request GET /v1/agent/metrics (1.227366ms) from=127.0.0.1:52014
2019/12/30 19:04:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:26 [INFO]  raft: Node at 127.0.0.1:40048 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.331715 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.332863 [INFO] consul: New leader elected: Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.404896 [DEBUG] http: Request GET /v1/agent/metrics (723.686µs) from=127.0.0.1:52014
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.414841 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:26.421164 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:26.421944 [DEBUG] consul: Skipping self join check for "Node 8e846288-4dc6-e189-acf1-cb9c144292a3" since the cluster is too small
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:26.422207 [INFO] consul: member 'Node 8e846288-4dc6-e189-acf1-cb9c144292a3' joined, marking health alive
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.463159 [DEBUG] http: Request GET /v1/agent/metrics (579.015µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.518726 [DEBUG] http: Request GET /v1/agent/metrics (646.684µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.533198 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.533921 [DEBUG] consul: Skipping self join check for "Node 992bfe13-5092-3523-b128-518e1fe76327" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.534129 [INFO] consul: member 'Node 992bfe13-5092-3523-b128-518e1fe76327' joined, marking health alive
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.578425 [DEBUG] http: Request GET /v1/agent/metrics (5.243806ms) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.639212 [DEBUG] http: Request GET /v1/agent/metrics (594.682µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.694977 [DEBUG] http: Request GET /v1/agent/metrics (975.359µs) from=127.0.0.1:52014
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.701898 [INFO] agent: Synced node info
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.702616 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.758214 [DEBUG] http: Request GET /v1/agent/metrics (2.608736ms) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:26.817601 [DEBUG] http: Request GET /v1/agent/metrics (656.018µs) from=127.0.0.1:52014
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.829653 [WARN] agent: Node name "Node e02ea9eb-476a-c13e-c1dd-2fac7cbc1b7a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.836542 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:26.839304 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:26.907615 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:40008 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:26.907749 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_DebugDisabled - 2019/12/30 19:04:26.907838 [INFO] agent: Endpoints down
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:26.940644 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
--- PASS: TestDebugCommand_DebugDisabled (8.95s)
=== CONT  TestDebugCommand_Archive
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.007694 [DEBUG] http: Request GET /v1/agent/metrics (897.024µs) from=127.0.0.1:52014
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_Archive - 2019/12/30 19:04:27.062039 [WARN] agent: Node name "Node 9bee9059-7ba0-9fad-9eb6-fe577e165d69" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.063432 [DEBUG] http: Request GET /v1/agent/metrics (653.684µs) from=127.0.0.1:52014
TestDebugCommand_Archive - 2019/12/30 19:04:27.065566 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_Archive - 2019/12/30 19:04:27.068562 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.118847 [DEBUG] http: Request GET /v1/agent/metrics (599.683µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.174135 [DEBUG] http: Request GET /v1/agent/metrics (538.015µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.230756 [DEBUG] http: Request GET /v1/agent/metrics (694.019µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.285374 [DEBUG] http: Request GET /v1/agent/metrics (719.686µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.333767 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.339730 [DEBUG] http: Request GET /v1/agent/metrics (963.692µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.394845 [DEBUG] http: Request GET /v1/agent/metrics (678.352µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.458091 [DEBUG] http: Request GET /v1/agent/metrics (635.35µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.515482 [DEBUG] http: Request GET /v1/agent/metrics (540.681µs) from=127.0.0.1:52014
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:27.567322 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:27.567452 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.570952 [DEBUG] http: Request GET /v1/agent/metrics (764.354µs) from=127.0.0.1:52014
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.626336 [DEBUG] http: Request GET /v1/agent/metrics (720.685µs) from=127.0.0.1:52014
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.745003 [WARN] agent: Node name "Node 73776a07-fd31-5b73-b848-871de9134bc9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.745656 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.748257 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.780118 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:27.780264 [DEBUG] agent: Node info in sync
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:27.962261 [INFO] agent: Requesting shutdown
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:27.962706 [INFO] consul: shutting down server
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:27.963153 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.095346 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.179043 [INFO] manager: shutting down
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.180643 [INFO] agent: consul server down
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.180757 [INFO] agent: shutdown complete
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.180911 [INFO] agent: Stopping DNS server 127.0.0.1:40031 (tcp)
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.183809 [INFO] agent: Stopping DNS server 127.0.0.1:40031 (udp)
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.184185 [INFO] agent: Stopping HTTP server 127.0.0.1:40032 (tcp)
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.684937 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ProfilesExist - 2019/12/30 19:04:28.685019 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_ProfilesExist (6.87s)
2019/12/30 19:04:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9bee9059-7ba0-9fad-9eb6-fe577e165d69 Address:127.0.0.1:40060}]
2019/12/30 19:04:28 [INFO]  raft: Node at 127.0.0.1:40060 [Follower] entering Follower state (Leader: "")
TestDebugCommand_Archive - 2019/12/30 19:04:28.950697 [INFO] serf: EventMemberJoin: Node 9bee9059-7ba0-9fad-9eb6-fe577e165d69.dc1 127.0.0.1
TestDebugCommand_Archive - 2019/12/30 19:04:28.969184 [INFO] serf: EventMemberJoin: Node 9bee9059-7ba0-9fad-9eb6-fe577e165d69 127.0.0.1
TestDebugCommand_Archive - 2019/12/30 19:04:28.970840 [INFO] consul: Adding LAN server Node 9bee9059-7ba0-9fad-9eb6-fe577e165d69 (Addr: tcp/127.0.0.1:40060) (DC: dc1)
TestDebugCommand_Archive - 2019/12/30 19:04:28.971614 [INFO] consul: Handled member-join event for server "Node 9bee9059-7ba0-9fad-9eb6-fe577e165d69.dc1" in area "wan"
TestDebugCommand_Archive - 2019/12/30 19:04:28.973225 [INFO] agent: Started DNS server 127.0.0.1:40055 (tcp)
TestDebugCommand_Archive - 2019/12/30 19:04:28.973635 [INFO] agent: Started DNS server 127.0.0.1:40055 (udp)
TestDebugCommand_Archive - 2019/12/30 19:04:28.976411 [INFO] agent: Started HTTP server on 127.0.0.1:40056 (tcp)
TestDebugCommand_Archive - 2019/12/30 19:04:28.976552 [INFO] agent: started state syncer
2019/12/30 19:04:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:28 [INFO]  raft: Node at 127.0.0.1:40060 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e02ea9eb-476a-c13e-c1dd-2fac7cbc1b7a Address:127.0.0.1:40054}]
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.046374 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.046816 [DEBUG] consul: Skipping self join check for "Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322" since the cluster is too small
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.046998 [INFO] consul: member 'Node 0579eaa5-17ce-8bb2-ef4c-1226fbac8322' joined, marking health alive
2019/12/30 19:04:29 [INFO]  raft: Node at 127.0.0.1:40054 [Follower] entering Follower state (Leader: "")
2019/12/30 19:04:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:73776a07-fd31-5b73-b848-871de9134bc9 Address:127.0.0.1:40066}]
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.056163 [INFO] serf: EventMemberJoin: Node e02ea9eb-476a-c13e-c1dd-2fac7cbc1b7a.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.057182 [INFO] serf: EventMemberJoin: Node 73776a07-fd31-5b73-b848-871de9134bc9.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.059579 [INFO] serf: EventMemberJoin: Node e02ea9eb-476a-c13e-c1dd-2fac7cbc1b7a 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.060874 [INFO] agent: Started DNS server 127.0.0.1:40049 (udp)
2019/12/30 19:04:29 [INFO]  raft: Node at 127.0.0.1:40066 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.061964 [INFO] consul: Adding LAN server Node e02ea9eb-476a-c13e-c1dd-2fac7cbc1b7a (Addr: tcp/127.0.0.1:40054) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.062620 [INFO] agent: Started DNS server 127.0.0.1:40049 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.062138 [INFO] consul: Handled member-join event for server "Node e02ea9eb-476a-c13e-c1dd-2fac7cbc1b7a.dc1" in area "wan"
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.066776 [INFO] serf: EventMemberJoin: Node 73776a07-fd31-5b73-b848-871de9134bc9 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.069532 [INFO] consul: Adding LAN server Node 73776a07-fd31-5b73-b848-871de9134bc9 (Addr: tcp/127.0.0.1:40066) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.069791 [INFO] consul: Handled member-join event for server "Node 73776a07-fd31-5b73-b848-871de9134bc9.dc1" in area "wan"
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.070774 [INFO] agent: Started DNS server 127.0.0.1:40061 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.070966 [INFO] agent: Started DNS server 127.0.0.1:40061 (udp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.073292 [INFO] agent: Started HTTP server on 127.0.0.1:40062 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.073549 [INFO] agent: started state syncer
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.080339 [INFO] agent: Started HTTP server on 127.0.0.1:40050 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.080479 [INFO] agent: started state syncer
2019/12/30 19:04:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:29 [INFO]  raft: Node at 127.0.0.1:40054 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:29 [INFO]  raft: Node at 127.0.0.1:40066 [Candidate] entering Candidate state in term 2
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.330304 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.330386 [DEBUG] agent: Node info in sync
2019/12/30 19:04:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:29 [INFO]  raft: Node at 127.0.0.1:40060 [Leader] entering Leader state
TestDebugCommand_Archive - 2019/12/30 19:04:29.713246 [INFO] consul: cluster leadership acquired
TestDebugCommand_Archive - 2019/12/30 19:04:29.713730 [INFO] consul: New leader elected: Node 9bee9059-7ba0-9fad-9eb6-fe577e165d69
2019/12/30 19:04:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:29 [INFO]  raft: Node at 127.0.0.1:40054 [Leader] entering Leader state
2019/12/30 19:04:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:29 [INFO]  raft: Node at 127.0.0.1:40066 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.805935 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:29.806385 [INFO] consul: New leader elected: Node e02ea9eb-476a-c13e-c1dd-2fac7cbc1b7a
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.806477 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:29.806789 [INFO] consul: New leader elected: Node 73776a07-fd31-5b73-b848-871de9134bc9
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.019934 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_Archive - 2019/12/30 19:04:30.096846 [INFO] agent: Synced node info
TestDebugCommand_Archive - 2019/12/30 19:04:30.097029 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.204780 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.204954 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.250109 [DEBUG] agent: Node info in sync
TestDebugCommand_Archive - 2019/12/30 19:04:30.309786 [DEBUG] http: Request GET /v1/agent/self (183.967886ms) from=127.0.0.1:36818
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.412973 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.416175 [DEBUG] http: Request GET /v1/agent/self (202.602047ms) from=127.0.0.1:59736
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.445765 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.446023 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.446135 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.466143 [DEBUG] http: Request GET /v1/agent/host (36.609639ms) from=127.0.0.1:59736
TestDebugCommand_Archive - 2019/12/30 19:04:30.527711 [DEBUG] http: Request GET /v1/agent/self (201.368348ms) from=127.0.0.1:36818
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.620380 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.711396 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:30.712001 [ERR] agent: failed to sync remote state: No cluster leader
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.722645 [DEBUG] http: Request GET /v1/agent/self (251.340008ms) from=127.0.0.1:59736
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.738245 [DEBUG] http: Request GET /v1/agent/members?wan=1 (1.008027ms) from=127.0.0.1:59736
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:30.847639 [DEBUG] http: Request GET /v1/agent/metrics (3.423425ms) from=127.0.0.1:59736
TestDebugCommand_Archive - 2019/12/30 19:04:31.214650 [INFO] agent: Requesting shutdown
TestDebugCommand_Archive - 2019/12/30 19:04:31.214768 [INFO] consul: shutting down server
TestDebugCommand_Archive - 2019/12/30 19:04:31.214829 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.345443 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.345653 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.345696 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.345731 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.345740 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.345881 [INFO] agent: Stopping DNS server 127.0.0.1:40049 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.346080 [INFO] agent: Stopping DNS server 127.0.0.1:40049 (udp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.346268 [INFO] agent: Stopping HTTP server 127.0.0.1:40050 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.346527 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.346611 [INFO] agent: Endpoints down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.347354 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.347437 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.347490 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/30 19:04:31.348661 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/30 19:04:31.437319 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.437695 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/30 19:04:31.537132 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.537186 [INFO] manager: shutting down
TestDebugCommand_Archive - 2019/12/30 19:04:31.537438 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDebugCommand_Archive - 2019/12/30 19:04:31.537478 [INFO] agent: consul server down
TestDebugCommand_Archive - 2019/12/30 19:04:31.537507 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDebugCommand_Archive - 2019/12/30 19:04:31.537574 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDebugCommand_Archive - 2019/12/30 19:04:31.537638 [ERR] consul: failed to transfer leadership in 3 attempts
TestDebugCommand_Archive - 2019/12/30 19:04:31.537528 [INFO] agent: shutdown complete
TestDebugCommand_Archive - 2019/12/30 19:04:31.537809 [INFO] agent: Stopping DNS server 127.0.0.1:40055 (tcp)
TestDebugCommand_Archive - 2019/12/30 19:04:31.538005 [INFO] agent: Stopping DNS server 127.0.0.1:40055 (udp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.538064 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.538113 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.538169 [INFO] agent: Stopping DNS server 127.0.0.1:40043 (tcp)
TestDebugCommand_Archive - 2019/12/30 19:04:31.538190 [INFO] agent: Stopping HTTP server 127.0.0.1:40056 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.538334 [INFO] agent: Stopping DNS server 127.0.0.1:40043 (udp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.538489 [INFO] agent: Stopping HTTP server 127.0.0.1:40044 (tcp)
TestDebugCommand_Archive - 2019/12/30 19:04:31.538632 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.538720 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_Archive - 2019/12/30 19:04:31.538741 [INFO] agent: Endpoints down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.538785 [INFO] agent: Endpoints down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.539696 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.539846 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.539912 [WARN] serf: Shutdown without a Leave
--- PASS: TestDebugCommand_Archive (4.58s)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.695383 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:31.796074 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:31.796553 [DEBUG] consul: Skipping self join check for "Node 73776a07-fd31-5b73-b848-871de9134bc9" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:31.796736 [INFO] consul: member 'Node 73776a07-fd31-5b73-b848-871de9134bc9' joined, marking health alive
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.887311 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.888145 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.888220 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.888283 [INFO] agent: Stopping DNS server 127.0.0.1:40025 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.888452 [INFO] agent: Stopping DNS server 127.0.0.1:40025 (udp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.888629 [INFO] agent: Stopping HTTP server 127.0.0.1:40026 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.888904 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/12/30 19:04:31.888962 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_ValidateTiming (10.72s)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:32.414756 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:32.944753 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:32.944847 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:32.944896 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:33.053638 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:33.237148 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:33.237857 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:33.237923 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:33.238007 [INFO] agent: Stopping DNS server 127.0.0.1:40061 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:33.238169 [INFO] agent: Stopping DNS server 127.0.0.1:40061 (udp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:33.238339 [INFO] agent: Stopping HTTP server 127.0.0.1:40062 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.238761 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:40062 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.238859 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.238901 [INFO] agent: Endpoints down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.243143 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.243264 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.243321 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.295412 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.371498 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.371645 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.371751 [INFO] agent: Stopping DNS server 127.0.0.1:40037 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.371985 [INFO] agent: Stopping DNS server 127.0.0.1:40037 (udp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.372186 [INFO] agent: Stopping HTTP server 127.0.0.1:40038 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.372679 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.372737 [INFO] agent: Endpoints down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.375590 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.381115 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.381218 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.381327 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.487166 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.570612 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.571347 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.571412 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.571474 [INFO] agent: Stopping DNS server 127.0.0.1:40013 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.571632 [INFO] agent: Stopping DNS server 127.0.0.1:40013 (udp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.571807 [INFO] agent: Stopping HTTP server 127.0.0.1:40014 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.572287 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/12/30 19:04:34.572342 [INFO] agent: Endpoints down
--- FAIL: TestDebugCommand_CaptureTargets (16.57s)
    debug_test.go:317: all-but-pprof: output data should exist for */consul.log
FAIL
FAIL	github.com/hashicorp/consul/command/debug	16.789s
=== RUN   TestEventCommand_noTabs
=== PAUSE TestEventCommand_noTabs
=== RUN   TestEventCommand
=== PAUSE TestEventCommand
=== CONT  TestEventCommand_noTabs
=== CONT  TestEventCommand
--- PASS: TestEventCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestEventCommand - 2019/12/30 19:04:09.644681 [WARN] agent: Node name "Node 32533466-0c5c-9da2-c208-e356a5cdab4b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventCommand - 2019/12/30 19:04:09.645770 [DEBUG] tlsutil: Update with version 1
TestEventCommand - 2019/12/30 19:04:09.655956 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:32533466-0c5c-9da2-c208-e356a5cdab4b Address:127.0.0.1:25006}]
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:25006 [Follower] entering Follower state (Leader: "")
TestEventCommand - 2019/12/30 19:04:12.182367 [INFO] serf: EventMemberJoin: Node 32533466-0c5c-9da2-c208-e356a5cdab4b.dc1 127.0.0.1
TestEventCommand - 2019/12/30 19:04:12.191462 [INFO] serf: EventMemberJoin: Node 32533466-0c5c-9da2-c208-e356a5cdab4b 127.0.0.1
TestEventCommand - 2019/12/30 19:04:12.192647 [INFO] consul: Adding LAN server Node 32533466-0c5c-9da2-c208-e356a5cdab4b (Addr: tcp/127.0.0.1:25006) (DC: dc1)
TestEventCommand - 2019/12/30 19:04:12.193196 [INFO] consul: Handled member-join event for server "Node 32533466-0c5c-9da2-c208-e356a5cdab4b.dc1" in area "wan"
TestEventCommand - 2019/12/30 19:04:12.193531 [INFO] agent: Started DNS server 127.0.0.1:25001 (udp)
TestEventCommand - 2019/12/30 19:04:12.193847 [INFO] agent: Started DNS server 127.0.0.1:25001 (tcp)
TestEventCommand - 2019/12/30 19:04:12.197260 [INFO] agent: Started HTTP server on 127.0.0.1:25002 (tcp)
TestEventCommand - 2019/12/30 19:04:12.198259 [INFO] agent: started state syncer
2019/12/30 19:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:25006 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:12 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:25006 [Leader] entering Leader state
TestEventCommand - 2019/12/30 19:04:12.948351 [INFO] consul: cluster leadership acquired
TestEventCommand - 2019/12/30 19:04:12.948839 [INFO] consul: New leader elected: Node 32533466-0c5c-9da2-c208-e356a5cdab4b
TestEventCommand - 2019/12/30 19:04:13.772944 [INFO] agent: Synced node info
TestEventCommand - 2019/12/30 19:04:13.773159 [DEBUG] agent: Node info in sync
TestEventCommand - 2019/12/30 19:04:13.786134 [DEBUG] http: Request GET /v1/agent/self (764.803655ms) from=127.0.0.1:55990
TestEventCommand - 2019/12/30 19:04:13.806372 [DEBUG] http: Request PUT /v1/event/fire/cmd (1.584709ms) from=127.0.0.1:55990
TestEventCommand - 2019/12/30 19:04:13.806685 [DEBUG] consul: User event: cmd
TestEventCommand - 2019/12/30 19:04:13.808778 [DEBUG] agent: new event: cmd (fa037d2d-a39b-fc61-37f6-d0fa709adced)
TestEventCommand - 2019/12/30 19:04:13.809194 [INFO] agent: Requesting shutdown
TestEventCommand - 2019/12/30 19:04:13.809270 [INFO] consul: shutting down server
TestEventCommand - 2019/12/30 19:04:13.809313 [WARN] serf: Shutdown without a Leave
TestEventCommand - 2019/12/30 19:04:13.961609 [WARN] serf: Shutdown without a Leave
TestEventCommand - 2019/12/30 19:04:14.045149 [INFO] manager: shutting down
TestEventCommand - 2019/12/30 19:04:14.050118 [INFO] agent: consul server down
TestEventCommand - 2019/12/30 19:04:14.050199 [INFO] agent: shutdown complete
TestEventCommand - 2019/12/30 19:04:14.050264 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (tcp)
TestEventCommand - 2019/12/30 19:04:14.050446 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (udp)
TestEventCommand - 2019/12/30 19:04:14.050619 [INFO] agent: Stopping HTTP server 127.0.0.1:25002 (tcp)
TestEventCommand - 2019/12/30 19:04:14.051316 [INFO] agent: Waiting for endpoints to shut down
TestEventCommand - 2019/12/30 19:04:14.051476 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestEventCommand - 2019/12/30 19:04:14.051620 [INFO] agent: Endpoints down
--- PASS: TestEventCommand (4.50s)
TestEventCommand - 2019/12/30 19:04:14.051649 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
PASS
ok  	github.com/hashicorp/consul/command/event	4.863s
=== RUN   TestExecCommand_noTabs
=== PAUSE TestExecCommand_noTabs
=== RUN   TestExecCommand
=== PAUSE TestExecCommand
=== RUN   TestExecCommand_NoShell
=== PAUSE TestExecCommand_NoShell
=== RUN   TestExecCommand_CrossDC
--- SKIP: TestExecCommand_CrossDC (0.00s)
    exec_test.go:70: DM-skipped
=== RUN   TestExecCommand_Validate
=== PAUSE TestExecCommand_Validate
=== RUN   TestExecCommand_Sessions
=== PAUSE TestExecCommand_Sessions
=== RUN   TestExecCommand_Sessions_Foreign
=== PAUSE TestExecCommand_Sessions_Foreign
=== RUN   TestExecCommand_UploadDestroy
=== PAUSE TestExecCommand_UploadDestroy
=== RUN   TestExecCommand_StreamResults
=== PAUSE TestExecCommand_StreamResults
=== CONT  TestExecCommand_noTabs
=== CONT  TestExecCommand_Sessions_Foreign
=== CONT  TestExecCommand_StreamResults
=== CONT  TestExecCommand_UploadDestroy
--- PASS: TestExecCommand_noTabs (0.00s)
=== CONT  TestExecCommand_Validate
--- PASS: TestExecCommand_Validate (0.00s)
=== CONT  TestExecCommand_Sessions
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:09.629045 [WARN] agent: Node name "Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:09.636052 [DEBUG] tlsutil: Update with version 1
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:09.643942 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_StreamResults - 2019/12/30 19:04:09.658455 [WARN] agent: Node name "Node 0c530816-fb91-65af-16f8-9ba13c6ddcae" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_Sessions - 2019/12/30 19:04:09.658809 [WARN] agent: Node name "Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_StreamResults - 2019/12/30 19:04:09.658895 [DEBUG] tlsutil: Update with version 1
TestExecCommand_UploadDestroy - 2019/12/30 19:04:09.659476 [WARN] agent: Node name "Node b0343907-6f83-2c03-e458-e322a60af5c4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_Sessions - 2019/12/30 19:04:09.659599 [DEBUG] tlsutil: Update with version 1
TestExecCommand_UploadDestroy - 2019/12/30 19:04:09.660177 [DEBUG] tlsutil: Update with version 1
TestExecCommand_StreamResults - 2019/12/30 19:04:09.661670 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions - 2019/12/30 19:04:09.662626 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_UploadDestroy - 2019/12/30 19:04:09.663429 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2a561ca5-d95e-ed50-2bea-52289c7ae1d4 Address:127.0.0.1:23524}]
2019/12/30 19:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b0343907-6f83-2c03-e458-e322a60af5c4 Address:127.0.0.1:23518}]
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23524 [Follower] entering Follower state (Leader: "")
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23518 [Follower] entering Follower state (Leader: "")
2019/12/30 19:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e4e664e1-99cc-3b64-ecb5-315e7727c9aa Address:127.0.0.1:23506}]
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.176714 [INFO] serf: EventMemberJoin: Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa.dc1 127.0.0.1
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.180554 [INFO] serf: EventMemberJoin: Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa 127.0.0.1
TestExecCommand_Sessions - 2019/12/30 19:04:12.181127 [INFO] serf: EventMemberJoin: Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4.dc1 127.0.0.1
2019/12/30 19:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0c530816-fb91-65af-16f8-9ba13c6ddcae Address:127.0.0.1:23512}]
TestExecCommand_Sessions - 2019/12/30 19:04:12.185902 [INFO] serf: EventMemberJoin: Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4 127.0.0.1
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23512 [Follower] entering Follower state (Leader: "")
TestExecCommand_Sessions - 2019/12/30 19:04:12.187618 [INFO] agent: Started DNS server 127.0.0.1:23519 (udp)
TestExecCommand_Sessions - 2019/12/30 19:04:12.188836 [INFO] consul: Adding LAN server Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4 (Addr: tcp/127.0.0.1:23524) (DC: dc1)
TestExecCommand_Sessions - 2019/12/30 19:04:12.189180 [INFO] consul: Handled member-join event for server "Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4.dc1" in area "wan"
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.193188 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.197374 [INFO] consul: Adding LAN server Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.197616 [INFO] consul: Handled member-join event for server "Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa.dc1" in area "wan"
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.198509 [INFO] serf: EventMemberJoin: Node b0343907-6f83-2c03-e458-e322a60af5c4.dc1 127.0.0.1
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.209741 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.212606 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:12.212776 [INFO] agent: started state syncer
TestExecCommand_Sessions - 2019/12/30 19:04:12.213627 [INFO] agent: Started DNS server 127.0.0.1:23519 (tcp)
TestExecCommand_StreamResults - 2019/12/30 19:04:12.215561 [INFO] serf: EventMemberJoin: Node 0c530816-fb91-65af-16f8-9ba13c6ddcae.dc1 127.0.0.1
TestExecCommand_Sessions - 2019/12/30 19:04:12.216430 [INFO] agent: Started HTTP server on 127.0.0.1:23520 (tcp)
TestExecCommand_Sessions - 2019/12/30 19:04:12.216666 [INFO] agent: started state syncer
2019/12/30 19:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
TestExecCommand_StreamResults - 2019/12/30 19:04:12.220628 [INFO] serf: EventMemberJoin: Node 0c530816-fb91-65af-16f8-9ba13c6ddcae 127.0.0.1
TestExecCommand_StreamResults - 2019/12/30 19:04:12.221466 [INFO] consul: Handled member-join event for server "Node 0c530816-fb91-65af-16f8-9ba13c6ddcae.dc1" in area "wan"
TestExecCommand_StreamResults - 2019/12/30 19:04:12.221759 [INFO] consul: Adding LAN server Node 0c530816-fb91-65af-16f8-9ba13c6ddcae (Addr: tcp/127.0.0.1:23512) (DC: dc1)
TestExecCommand_StreamResults - 2019/12/30 19:04:12.221966 [INFO] agent: Started DNS server 127.0.0.1:23507 (udp)
TestExecCommand_StreamResults - 2019/12/30 19:04:12.222277 [INFO] agent: Started DNS server 127.0.0.1:23507 (tcp)
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.223912 [INFO] serf: EventMemberJoin: Node b0343907-6f83-2c03-e458-e322a60af5c4 127.0.0.1
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.227842 [INFO] agent: Started DNS server 127.0.0.1:23513 (udp)
TestExecCommand_StreamResults - 2019/12/30 19:04:12.228077 [INFO] agent: Started HTTP server on 127.0.0.1:23508 (tcp)
TestExecCommand_StreamResults - 2019/12/30 19:04:12.228171 [INFO] agent: started state syncer
2019/12/30 19:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23524 [Candidate] entering Candidate state in term 2
2019/12/30 19:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23518 [Candidate] entering Candidate state in term 2
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.241719 [INFO] consul: Adding LAN server Node b0343907-6f83-2c03-e458-e322a60af5c4 (Addr: tcp/127.0.0.1:23518) (DC: dc1)
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.246519 [INFO] consul: Handled member-join event for server "Node b0343907-6f83-2c03-e458-e322a60af5c4.dc1" in area "wan"
2019/12/30 19:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:12 [INFO]  raft: Node at 127.0.0.1:23512 [Candidate] entering Candidate state in term 2
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.252470 [INFO] agent: Started DNS server 127.0.0.1:23513 (tcp)
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.261188 [INFO] agent: Started HTTP server on 127.0.0.1:23514 (tcp)
TestExecCommand_UploadDestroy - 2019/12/30 19:04:12.261307 [INFO] agent: started state syncer
2019/12/30 19:04:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:13 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
2019/12/30 19:04:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:13 [INFO]  raft: Node at 127.0.0.1:23518 [Leader] entering Leader state
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.039304 [INFO] consul: cluster leadership acquired
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.039942 [INFO] consul: New leader elected: Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa
2019/12/30 19:04:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:13 [INFO]  raft: Node at 127.0.0.1:23512 [Leader] entering Leader state
TestExecCommand_StreamResults - 2019/12/30 19:04:13.041089 [INFO] consul: cluster leadership acquired
TestExecCommand_UploadDestroy - 2019/12/30 19:04:13.041454 [INFO] consul: cluster leadership acquired
TestExecCommand_StreamResults - 2019/12/30 19:04:13.041658 [INFO] consul: New leader elected: Node 0c530816-fb91-65af-16f8-9ba13c6ddcae
TestExecCommand_UploadDestroy - 2019/12/30 19:04:13.041818 [INFO] consul: New leader elected: Node b0343907-6f83-2c03-e458-e322a60af5c4
2019/12/30 19:04:13 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:13 [INFO]  raft: Node at 127.0.0.1:23524 [Leader] entering Leader state
TestExecCommand_Sessions - 2019/12/30 19:04:13.044480 [INFO] consul: cluster leadership acquired
TestExecCommand_Sessions - 2019/12/30 19:04:13.044878 [INFO] consul: New leader elected: Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.161277 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (4.045107ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.189794 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.071362ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.218029 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.016694ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.260709 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (15.409409ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.289965 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.602376ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.318015 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (716.685µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.345845 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (656.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.373592 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (651.35µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.395812 [INFO] agent: Synced node info
TestExecCommand_Sessions - 2019/12/30 19:04:13.395812 [INFO] agent: Synced node info
TestExecCommand_Sessions - 2019/12/30 19:04:13.396009 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/12/30 19:04:13.397482 [INFO] agent: Synced node info
TestExecCommand_UploadDestroy - 2019/12/30 19:04:13.400350 [INFO] agent: Synced node info
TestExecCommand_UploadDestroy - 2019/12/30 19:04:13.400453 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.406558 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.886716ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.436365 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (2.620737ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.464266 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (645.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.492305 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (657.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.521208 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.821382ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.549775 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (711.019µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.577928 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (679.018µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.606195 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (704.019µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.633894 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (654.017µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.661837 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (728.353µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.689672 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (771.021µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.732745 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (877.69µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.760700 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (676.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.789067 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (748.02µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.817259 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (737.686µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.845384 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (669.351µs) from=127.0.0.1:37900
TestExecCommand_StreamResults - 2019/12/30 19:04:13.856817 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/12/30 19:04:13.856914 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.874264 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (640.35µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.902282 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (642.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.930339 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (691.352µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.958202 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (671.018µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:13.987451 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (868.023µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.015578 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (694.352µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.043575 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (630.35µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.077501 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (839.022µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.106273 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (744.687µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.135753 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (724.686µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.163724 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (658.351µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.192433 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (793.688µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.220686 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.020694ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.251405 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (662.018µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.279289 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (731.352µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.306953 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (642.017µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.335288 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (675.351µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.363333 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (777.021µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.392057 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (716.686µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.419868 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (711.352µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.447871 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (573.015µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.476101 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (646.351µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.503862 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (697.352µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.541200 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (828.689µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.569249 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (706.685µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.597654 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (740.353µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.627261 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (680.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.655128 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (750.353µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.683120 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (710.353µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.692385 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.692574 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.710915 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (626.683µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.738836 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (704.685µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.767083 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (694.685µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.795158 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (900.024µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.822822 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (598.683µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.850666 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (619.683µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.878893 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (667.351µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.906713 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (587.682µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.940840 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (6.853515ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.969097 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (708.353µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:14.997989 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (667.018µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.026030 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (674.351µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.063507 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.729379ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.108304 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.337702ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.142857 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.938718ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.171864 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (652.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.203176 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (2.322062ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.233169 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (773.688µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.261925 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (640.351µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.290578 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (656.684µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.320580 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (906.69µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.348900 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (863.023µs) from=127.0.0.1:37900
TestExecCommand_Sessions - 2019/12/30 19:04:15.355219 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_Sessions - 2019/12/30 19:04:15.355752 [DEBUG] consul: Skipping self join check for "Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4" since the cluster is too small
TestExecCommand_Sessions - 2019/12/30 19:04:15.355940 [INFO] consul: member 'Node 2a561ca5-d95e-ed50-2bea-52289c7ae1d4' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.377442 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (863.69µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.405850 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (779.354µs) from=127.0.0.1:37900
TestExecCommand_UploadDestroy - 2019/12/30 19:04:15.429167 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.430636 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.431103 [DEBUG] consul: Skipping self join check for "Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa" since the cluster is too small
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.431262 [INFO] consul: member 'Node e4e664e1-99cc-3b64-ecb5-315e7727c9aa' joined, marking health alive
TestExecCommand_StreamResults - 2019/12/30 19:04:15.435260 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_StreamResults - 2019/12/30 19:04:15.435703 [DEBUG] consul: Skipping self join check for "Node 0c530816-fb91-65af-16f8-9ba13c6ddcae" since the cluster is too small
TestExecCommand_StreamResults - 2019/12/30 19:04:15.435852 [INFO] consul: member 'Node 0c530816-fb91-65af-16f8-9ba13c6ddcae' joined, marking health alive
TestExecCommand_UploadDestroy - 2019/12/30 19:04:15.436477 [DEBUG] consul: Skipping self join check for "Node b0343907-6f83-2c03-e458-e322a60af5c4" since the cluster is too small
TestExecCommand_UploadDestroy - 2019/12/30 19:04:15.436650 [INFO] consul: member 'Node b0343907-6f83-2c03-e458-e322a60af5c4' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.439458 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.077029ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.468090 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (832.022µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.496551 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (910.358µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.527360 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (864.356µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.561106 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (767.354µs) from=127.0.0.1:37900
TestExecCommand_UploadDestroy - 2019/12/30 19:04:15.564832 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.589677 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.016361ms) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.636083 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (686.352µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.664528 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (702.018µs) from=127.0.0.1:37900
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.711271 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (3.093082ms) from=127.0.0.1:37900
TestExecCommand_UploadDestroy - 2019/12/30 19:04:15.737505 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_StreamResults - 2019/12/30 19:04:15.738416 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.745891 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:15.757242 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.638043ms) from=127.0.0.1:37900
TestExecCommand_Sessions - 2019/12/30 19:04:15.838908 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions - 2019/12/30 19:04:15.841898 [DEBUG] http: Request PUT /v1/session/create (283.05252ms) from=127.0.0.1:50754
TestExecCommand_Sessions - 2019/12/30 19:04:15.850815 [DEBUG] http: Request GET /v1/session/info/e7e68519-9083-e55e-0597-176fdbe267a8 (1.608042ms) from=127.0.0.1:50760
TestExecCommand_Sessions - 2019/12/30 19:04:15.932819 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand_Sessions - 2019/12/30 19:04:15.932908 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.012874 [DEBUG] http: Request PUT /v1/session/create (245.602859ms) from=127.0.0.1:37900
TestExecCommand_Sessions - 2019/12/30 19:04:16.014021 [DEBUG] http: Request PUT /v1/session/destroy/e7e68519-9083-e55e-0597-176fdbe267a8 (160.566599ms) from=127.0.0.1:50754
TestExecCommand_StreamResults - 2019/12/30 19:04:16.016521 [DEBUG] http: Request PUT /v1/session/create (257.333171ms) from=127.0.0.1:58462
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.021937 [DEBUG] http: Request PUT /v1/session/create (261.543282ms) from=127.0.0.1:60324
TestExecCommand_Sessions - 2019/12/30 19:04:16.022658 [DEBUG] http: Request GET /v1/session/info/e7e68519-9083-e55e-0597-176fdbe267a8 (1.683711ms) from=127.0.0.1:50764
TestExecCommand_Sessions - 2019/12/30 19:04:16.023827 [INFO] agent: Requesting shutdown
TestExecCommand_Sessions - 2019/12/30 19:04:16.023926 [INFO] consul: shutting down server
TestExecCommand_Sessions - 2019/12/30 19:04:16.023972 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/12/30 19:04:16.029787 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/?keys=&wait=2000ms (437.678µs) from=127.0.0.1:58462
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.019365 [DEBUG] http: Request GET /v1/session/info/ac1002d5-6ea1-3481-68c5-ff70b68dac0c (1.204032ms) from=127.0.0.1:37910
TestExecCommand_Sessions - 2019/12/30 19:04:16.155471 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/12/30 19:04:16.238457 [DEBUG] http: Request PUT /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/ack?acquire=3f81541f-024f-b3d9-52a0-777160b6557c (207.140504ms) from=127.0.0.1:58472
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.238563 [DEBUG] http: Request PUT /v1/session/destroy/ac1002d5-6ea1-3481-68c5-ff70b68dac0c (204.182091ms) from=127.0.0.1:37900
TestExecCommand_StreamResults - 2019/12/30 19:04:16.240219 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/?index=1&keys=&wait=2000ms (203.917084ms) from=127.0.0.1:58462
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.241524 [DEBUG] http: Request PUT /v1/kv/_rexec/24a6f5db-15ad-7708-30e5-816e6787fbf1/job?acquire=24a6f5db-15ad-7708-30e5-816e6787fbf1 (214.5167ms) from=127.0.0.1:60324
TestExecCommand_Sessions - 2019/12/30 19:04:16.246357 [INFO] manager: shutting down
TestExecCommand_Sessions - 2019/12/30 19:04:16.246990 [INFO] agent: consul server down
TestExecCommand_Sessions - 2019/12/30 19:04:16.247049 [INFO] agent: shutdown complete
TestExecCommand_Sessions - 2019/12/30 19:04:16.247099 [INFO] agent: Stopping DNS server 127.0.0.1:23519 (tcp)
TestExecCommand_Sessions - 2019/12/30 19:04:16.247229 [INFO] agent: Stopping DNS server 127.0.0.1:23519 (udp)
TestExecCommand_Sessions - 2019/12/30 19:04:16.247379 [INFO] agent: Stopping HTTP server 127.0.0.1:23520 (tcp)
TestExecCommand_Sessions - 2019/12/30 19:04:16.248214 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_Sessions - 2019/12/30 19:04:16.248698 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_Sessions (6.96s)
=== CONT  TestExecCommand_NoShell
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.250896 [DEBUG] http: Request GET /v1/session/info/ac1002d5-6ea1-3481-68c5-ff70b68dac0c (7.007186ms) from=127.0.0.1:37916
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.252089 [INFO] agent: Requesting shutdown
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.252183 [INFO] consul: shutting down server
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.252234 [WARN] serf: Shutdown without a Leave
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.257498 [DEBUG] http: Request GET /v1/kv/_rexec/24a6f5db-15ad-7708-30e5-816e6787fbf1/job (13.18035ms) from=127.0.0.1:60336
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_NoShell - 2019/12/30 19:04:16.318290 [WARN] agent: Node name "Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_NoShell - 2019/12/30 19:04:16.318674 [DEBUG] tlsutil: Update with version 1
TestExecCommand_NoShell - 2019/12/30 19:04:16.320972 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.528261 [WARN] serf: Shutdown without a Leave
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.653688 [INFO] manager: shutting down
TestExecCommand_StreamResults - 2019/12/30 19:04:16.655298 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/?index=12&keys=&wait=2000ms (411.248593ms) from=127.0.0.1:58462
TestExecCommand_StreamResults - 2019/12/30 19:04:16.656093 [DEBUG] http: Request PUT /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/exit?acquire=3f81541f-024f-b3d9-52a0-777160b6557c (410.109563ms) from=127.0.0.1:58478
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.656315 [INFO] agent: consul server down
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.656415 [INFO] agent: shutdown complete
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.656531 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.656936 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.657267 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestExecCommand_StreamResults - 2019/12/30 19:04:16.658843 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/exit (1.054361ms) from=127.0.0.1:58462
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.660246 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_Sessions_Foreign - 2019/12/30 19:04:16.660407 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_Sessions_Foreign (7.38s)
=== CONT  TestExecCommand
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.665235 [DEBUG] http: Request DELETE /v1/kv/_rexec/24a6f5db-15ad-7708-30e5-816e6787fbf1?recurse= (404.97376ms) from=127.0.0.1:60324
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.671307 [DEBUG] http: Request GET /v1/kv/_rexec/24a6f5db-15ad-7708-30e5-816e6787fbf1/job (425.344µs) from=127.0.0.1:60340
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.672044 [INFO] agent: Requesting shutdown
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.672116 [INFO] consul: shutting down server
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.672162 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand - 2019/12/30 19:04:16.731229 [WARN] agent: Node name "Node 2ddb2c92-168c-f439-da95-d4780f3b37a4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand - 2019/12/30 19:04:16.731739 [DEBUG] tlsutil: Update with version 1
TestExecCommand - 2019/12/30 19:04:16.734014 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_UploadDestroy - 2019/12/30 19:04:16.986809 [WARN] serf: Shutdown without a Leave
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.153530 [INFO] manager: shutting down
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.154347 [INFO] agent: consul server down
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.154477 [INFO] agent: shutdown complete
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.154537 [INFO] agent: Stopping DNS server 127.0.0.1:23513 (tcp)
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.154681 [INFO] agent: Stopping DNS server 127.0.0.1:23513 (udp)
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.154840 [INFO] agent: Stopping HTTP server 127.0.0.1:23514 (tcp)
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.155559 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_UploadDestroy - 2019/12/30 19:04:17.155687 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_UploadDestroy (7.87s)
TestExecCommand_StreamResults - 2019/12/30 19:04:17.156700 [DEBUG] http: Request PUT /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/random?acquire=3f81541f-024f-b3d9-52a0-777160b6557c (481.950138ms) from=127.0.0.1:58482
TestExecCommand_StreamResults - 2019/12/30 19:04:17.157641 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/?index=13&keys=&wait=2000ms (485.712572ms) from=127.0.0.1:58462
TestExecCommand_StreamResults - 2019/12/30 19:04:17.471806 [DEBUG] http: Request PUT /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/out/00000?acquire=3f81541f-024f-b3d9-52a0-777160b6557c (311.375272ms) from=127.0.0.1:58484
TestExecCommand_StreamResults - 2019/12/30 19:04:17.473661 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/?index=14&keys=&wait=2000ms (311.030596ms) from=127.0.0.1:58462
TestExecCommand_StreamResults - 2019/12/30 19:04:17.479272 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/out/00000 (2.027387ms) from=127.0.0.1:58462
2019/12/30 19:04:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:70e8e0d1-6b02-0361-cc4b-67691aa05c90 Address:127.0.0.1:23530}]
2019/12/30 19:04:17 [INFO]  raft: Node at 127.0.0.1:23530 [Follower] entering Follower state (Leader: "")
TestExecCommand_StreamResults - 2019/12/30 19:04:17.724491 [DEBUG] http: Request PUT /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/out/00001?acquire=3f81541f-024f-b3d9-52a0-777160b6557c (239.741036ms) from=127.0.0.1:58486
TestExecCommand_StreamResults - 2019/12/30 19:04:17.726023 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/?index=15&keys=&wait=2000ms (236.01427ms) from=127.0.0.1:58462
TestExecCommand_NoShell - 2019/12/30 19:04:17.729240 [INFO] serf: EventMemberJoin: Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90.dc1 127.0.0.1
TestExecCommand_StreamResults - 2019/12/30 19:04:17.732262 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/foo/out/00001 (771.354µs) from=127.0.0.1:58462
TestExecCommand_NoShell - 2019/12/30 19:04:17.733527 [INFO] serf: EventMemberJoin: Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90 127.0.0.1
TestExecCommand_NoShell - 2019/12/30 19:04:17.734780 [INFO] agent: Started DNS server 127.0.0.1:23525 (udp)
TestExecCommand_StreamResults - 2019/12/30 19:04:17.735102 [INFO] agent: Requesting shutdown
TestExecCommand_NoShell - 2019/12/30 19:04:17.735225 [INFO] consul: Adding LAN server Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90 (Addr: tcp/127.0.0.1:23530) (DC: dc1)
TestExecCommand_StreamResults - 2019/12/30 19:04:17.735251 [INFO] consul: shutting down server
TestExecCommand_StreamResults - 2019/12/30 19:04:17.735508 [WARN] serf: Shutdown without a Leave
TestExecCommand_NoShell - 2019/12/30 19:04:17.735415 [INFO] consul: Handled member-join event for server "Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90.dc1" in area "wan"
TestExecCommand_NoShell - 2019/12/30 19:04:17.735883 [INFO] agent: Started DNS server 127.0.0.1:23525 (tcp)
TestExecCommand_NoShell - 2019/12/30 19:04:17.738559 [INFO] agent: Started HTTP server on 127.0.0.1:23526 (tcp)
TestExecCommand_NoShell - 2019/12/30 19:04:17.738683 [INFO] agent: started state syncer
2019/12/30 19:04:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:17 [INFO]  raft: Node at 127.0.0.1:23530 [Candidate] entering Candidate state in term 2
TestExecCommand_StreamResults - 2019/12/30 19:04:17.870289 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/12/30 19:04:17.978682 [INFO] manager: shutting down
TestExecCommand_StreamResults - 2019/12/30 19:04:17.979759 [INFO] agent: consul server down
TestExecCommand_StreamResults - 2019/12/30 19:04:17.979854 [INFO] agent: shutdown complete
TestExecCommand_StreamResults - 2019/12/30 19:04:17.979967 [INFO] agent: Stopping DNS server 127.0.0.1:23507 (tcp)
TestExecCommand_StreamResults - 2019/12/30 19:04:17.980171 [INFO] agent: Stopping DNS server 127.0.0.1:23507 (udp)
TestExecCommand_StreamResults - 2019/12/30 19:04:17.980369 [INFO] agent: Stopping HTTP server 127.0.0.1:23508 (tcp)
2019/12/30 19:04:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2ddb2c92-168c-f439-da95-d4780f3b37a4 Address:127.0.0.1:23536}]
2019/12/30 19:04:18 [INFO]  raft: Node at 127.0.0.1:23536 [Follower] entering Follower state (Leader: "")
2019/12/30 19:04:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:04:18 [INFO]  raft: Node at 127.0.0.1:23536 [Candidate] entering Candidate state in term 2
TestExecCommand - 2019/12/30 19:04:18.216682 [INFO] serf: EventMemberJoin: Node 2ddb2c92-168c-f439-da95-d4780f3b37a4.dc1 127.0.0.1
TestExecCommand - 2019/12/30 19:04:18.220960 [INFO] serf: EventMemberJoin: Node 2ddb2c92-168c-f439-da95-d4780f3b37a4 127.0.0.1
TestExecCommand - 2019/12/30 19:04:18.222383 [INFO] agent: Started DNS server 127.0.0.1:23531 (udp)
TestExecCommand - 2019/12/30 19:04:18.225179 [INFO] consul: Handled member-join event for server "Node 2ddb2c92-168c-f439-da95-d4780f3b37a4.dc1" in area "wan"
TestExecCommand - 2019/12/30 19:04:18.225935 [INFO] consul: Adding LAN server Node 2ddb2c92-168c-f439-da95-d4780f3b37a4 (Addr: tcp/127.0.0.1:23536) (DC: dc1)
TestExecCommand - 2019/12/30 19:04:18.226094 [INFO] agent: Started DNS server 127.0.0.1:23531 (tcp)
TestExecCommand - 2019/12/30 19:04:18.233657 [INFO] agent: Started HTTP server on 127.0.0.1:23532 (tcp)
TestExecCommand - 2019/12/30 19:04:18.233818 [INFO] agent: started state syncer
2019/12/30 19:04:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:18 [INFO]  raft: Node at 127.0.0.1:23530 [Leader] entering Leader state
TestExecCommand_NoShell - 2019/12/30 19:04:18.497641 [INFO] consul: New leader elected: Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90
TestExecCommand_NoShell - 2019/12/30 19:04:18.497873 [INFO] consul: cluster leadership acquired
TestExecCommand_NoShell - 2019/12/30 19:04:18.871051 [INFO] agent: Synced node info
TestExecCommand_NoShell - 2019/12/30 19:04:18.871169 [DEBUG] agent: Node info in sync
2019/12/30 19:04:18 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:04:18 [INFO]  raft: Node at 127.0.0.1:23536 [Leader] entering Leader state
TestExecCommand - 2019/12/30 19:04:18.971695 [INFO] consul: cluster leadership acquired
TestExecCommand - 2019/12/30 19:04:18.972174 [INFO] consul: New leader elected: Node 2ddb2c92-168c-f439-da95-d4780f3b37a4
TestExecCommand_StreamResults - 2019/12/30 19:04:18.980688 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:23508 (tcp)
TestExecCommand_StreamResults - 2019/12/30 19:04:18.980771 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_StreamResults - 2019/12/30 19:04:18.980808 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_StreamResults (9.70s)
TestExecCommand_NoShell - 2019/12/30 19:04:19.247885 [DEBUG] agent: Node info in sync
TestExecCommand - 2019/12/30 19:04:19.297136 [INFO] agent: Synced node info
TestExecCommand - 2019/12/30 19:04:19.297241 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/12/30 19:04:19.805419 [DEBUG] http: Request GET /v1/kv/_rexec/3f81541f-024f-b3d9-52a0-777160b6557c/?index=16&keys=&wait=2000ms (2.07012033s) from=127.0.0.1:58462
TestExecCommand_NoShell - 2019/12/30 19:04:20.009788 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_NoShell - 2019/12/30 19:04:20.010961 [DEBUG] consul: Skipping self join check for "Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90" since the cluster is too small
TestExecCommand_NoShell - 2019/12/30 19:04:20.011451 [INFO] consul: member 'Node 70e8e0d1-6b02-0361-cc4b-67691aa05c90' joined, marking health alive
TestExecCommand_NoShell - 2019/12/30 19:04:20.214721 [DEBUG] http: Request GET /v1/agent/self (6.757513ms) from=127.0.0.1:55274
TestExecCommand_NoShell - 2019/12/30 19:04:20.538780 [DEBUG] http: Request PUT /v1/session/create (312.974648ms) from=127.0.0.1:55274
TestExecCommand - 2019/12/30 19:04:20.721052 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand - 2019/12/30 19:04:20.721620 [DEBUG] consul: Skipping self join check for "Node 2ddb2c92-168c-f439-da95-d4780f3b37a4" since the cluster is too small
TestExecCommand - 2019/12/30 19:04:20.721805 [INFO] consul: member 'Node 2ddb2c92-168c-f439-da95-d4780f3b37a4' joined, marking health alive
TestExecCommand_NoShell - 2019/12/30 19:04:20.824589 [DEBUG] http: Request PUT /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/job?acquire=78cefd37-bbc5-779b-a84e-91d6fad870d8 (280.400782ms) from=127.0.0.1:55274
TestExecCommand - 2019/12/30 19:04:20.948663 [DEBUG] http: Request GET /v1/agent/self (18.579494ms) from=127.0.0.1:34094
TestExecCommand_NoShell - 2019/12/30 19:04:21.028709 [DEBUG] consul: User event: _rexec
TestExecCommand_NoShell - 2019/12/30 19:04:21.028947 [DEBUG] agent: received remote exec event (ID: 010576f3-8261-385e-2dd4-593f6bc36fc7)
TestExecCommand_NoShell - 2019/12/30 19:04:21.029818 [DEBUG] http: Request PUT /v1/event/fire/_rexec (1.820382ms) from=127.0.0.1:55274
TestExecCommand_NoShell - 2019/12/30 19:04:21.033316 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/?keys=&wait=1000ms (621.35µs) from=127.0.0.1:55274
TestExecCommand_NoShell - 2019/12/30 19:04:21.381821 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_NoShell - 2019/12/30 19:04:21.382113 [INFO] agent: remote exec ''
TestExecCommand_NoShell - 2019/12/30 19:04:21.383544 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/?index=12&keys=&wait=1000ms (348.320253ms) from=127.0.0.1:55274
TestExecCommand - 2019/12/30 19:04:21.388789 [DEBUG] http: Request PUT /v1/session/create (423.067572ms) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:21.722812 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand - 2019/12/30 19:04:21.724524 [DEBUG] http: Request PUT /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/job?acquire=9d177087-3e65-278e-7136-4f7fd7c6119a (327.24136ms) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:21.928140 [DEBUG] consul: User event: _rexec
TestExecCommand - 2019/12/30 19:04:21.928618 [DEBUG] agent: received remote exec event (ID: 4cb94ab7-3231-2327-d9cd-2adb18a7f604)
TestExecCommand - 2019/12/30 19:04:21.928803 [DEBUG] http: Request PUT /v1/event/fire/_rexec (1.265701ms) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:21.932210 [DEBUG] http: Request GET /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/?keys=&wait=1000ms (717.019µs) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:22.154454 [INFO] agent: remote exec 'uptime'
TestExecCommand - 2019/12/30 19:04:22.166712 [DEBUG] http: Request GET /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/?index=12&keys=&wait=1000ms (232.066498ms) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:22.245039 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand - 2019/12/30 19:04:22.245135 [DEBUG] agent: Node info in sync
TestExecCommand_NoShell - 2019/12/30 19:04:22.396900 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/?index=13&keys=&wait=1000ms (1.009457482s) from=127.0.0.1:55274
TestExecCommand_NoShell - 2019/12/30 19:04:22.407773 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/?index=13&keys=&wait=1000ms (8.449225ms) from=127.0.0.1:55274
TestExecCommand_NoShell - 2019/12/30 19:04:22.416833 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/Node%2070e8e0d1-6b02-0361-cc4b-67691aa05c90/out/00000 (1.224032ms) from=127.0.0.1:55274
TestExecCommand - 2019/12/30 19:04:22.507339 [DEBUG] http: Request GET /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/?index=13&keys=&wait=1000ms (334.222879ms) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:22.525563 [DEBUG] http: Request GET /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/Node%202ddb2c92-168c-f439-da95-d4780f3b37a4/out/00000 (6.90185ms) from=127.0.0.1:34094
TestExecCommand_NoShell - 2019/12/30 19:04:22.677258 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/?index=14&keys=&wait=1000ms (257.522508ms) from=127.0.0.1:55274
TestExecCommand_NoShell - 2019/12/30 19:04:22.680619 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/Node%2070e8e0d1-6b02-0361-cc4b-67691aa05c90/exit (748.019µs) from=127.0.0.1:55274
TestExecCommand - 2019/12/30 19:04:22.749370 [DEBUG] http: Request GET /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/?index=14&keys=&wait=1000ms (212.006965ms) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:22.755139 [DEBUG] http: Request GET /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/Node%202ddb2c92-168c-f439-da95-d4780f3b37a4/exit (1.104363ms) from=127.0.0.1:34094
TestExecCommand_NoShell - 2019/12/30 19:04:23.716049 [DEBUG] http: Request GET /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8/?index=15&keys=&wait=1000ms (1.032036414s) from=127.0.0.1:55274
TestExecCommand - 2019/12/30 19:04:23.785385 [DEBUG] http: Request GET /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a/?index=15&keys=&wait=1000ms (1.027785301s) from=127.0.0.1:34094
TestExecCommand - 2019/12/30 19:04:23.998231 [DEBUG] http: Request PUT /v1/session/destroy/9d177087-3e65-278e-7136-4f7fd7c6119a (239.19402ms) from=127.0.0.1:34104
TestExecCommand_NoShell - 2019/12/30 19:04:24.008630 [DEBUG] http: Request PUT /v1/session/destroy/78cefd37-bbc5-779b-a84e-91d6fad870d8 (322.771908ms) from=127.0.0.1:55292
TestExecCommand - 2019/12/30 19:04:24.415834 [DEBUG] http: Request DELETE /v1/kv/_rexec/9d177087-3e65-278e-7136-4f7fd7c6119a?recurse= (414.86302ms) from=127.0.0.1:34104
TestExecCommand_NoShell - 2019/12/30 19:04:24.415834 [DEBUG] http: Request DELETE /v1/kv/_rexec/78cefd37-bbc5-779b-a84e-91d6fad870d8?recurse= (404.536746ms) from=127.0.0.1:55292
TestExecCommand - 2019/12/30 19:04:24.638876 [DEBUG] http: Request PUT /v1/session/destroy/9d177087-3e65-278e-7136-4f7fd7c6119a (220.377854ms) from=127.0.0.1:34104
TestExecCommand_NoShell - 2019/12/30 19:04:24.639212 [DEBUG] http: Request PUT /v1/session/destroy/78cefd37-bbc5-779b-a84e-91d6fad870d8 (219.750504ms) from=127.0.0.1:55292
TestExecCommand_NoShell - 2019/12/30 19:04:24.642236 [INFO] agent: Requesting shutdown
TestExecCommand_NoShell - 2019/12/30 19:04:24.642330 [INFO] consul: shutting down server
TestExecCommand_NoShell - 2019/12/30 19:04:24.642394 [WARN] serf: Shutdown without a Leave
TestExecCommand - 2019/12/30 19:04:24.644280 [INFO] agent: Requesting shutdown
TestExecCommand - 2019/12/30 19:04:24.644425 [INFO] consul: shutting down server
TestExecCommand - 2019/12/30 19:04:24.644504 [WARN] serf: Shutdown without a Leave
TestExecCommand_NoShell - 2019/12/30 19:04:24.712003 [WARN] serf: Shutdown without a Leave
TestExecCommand - 2019/12/30 19:04:24.716636 [WARN] serf: Shutdown without a Leave
TestExecCommand_NoShell - 2019/12/30 19:04:24.786969 [INFO] manager: shutting down
TestExecCommand_NoShell - 2019/12/30 19:04:24.787566 [INFO] agent: consul server down
TestExecCommand_NoShell - 2019/12/30 19:04:24.787621 [INFO] agent: shutdown complete
TestExecCommand_NoShell - 2019/12/30 19:04:24.787675 [INFO] agent: Stopping DNS server 127.0.0.1:23525 (tcp)
TestExecCommand_NoShell - 2019/12/30 19:04:24.787810 [INFO] agent: Stopping DNS server 127.0.0.1:23525 (udp)
TestExecCommand_NoShell - 2019/12/30 19:04:24.787951 [INFO] agent: Stopping HTTP server 127.0.0.1:23526 (tcp)
TestExecCommand_NoShell - 2019/12/30 19:04:24.788584 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_NoShell - 2019/12/30 19:04:24.788727 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_NoShell (8.54s)
TestExecCommand - 2019/12/30 19:04:24.845427 [INFO] manager: shutting down
TestExecCommand - 2019/12/30 19:04:24.846403 [INFO] agent: consul server down
TestExecCommand - 2019/12/30 19:04:24.846471 [INFO] agent: shutdown complete
TestExecCommand - 2019/12/30 19:04:24.846557 [INFO] agent: Stopping DNS server 127.0.0.1:23531 (tcp)
TestExecCommand - 2019/12/30 19:04:24.846756 [INFO] agent: Stopping DNS server 127.0.0.1:23531 (udp)
TestExecCommand - 2019/12/30 19:04:24.846934 [INFO] agent: Stopping HTTP server 127.0.0.1:23532 (tcp)
TestExecCommand - 2019/12/30 19:04:24.847675 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand - 2019/12/30 19:04:24.847844 [INFO] agent: Endpoints down
--- PASS: TestExecCommand (8.19s)
PASS
ok  	github.com/hashicorp/consul/command/exec	15.815s
=== RUN   TestConfigUtil_Values
=== PAUSE TestConfigUtil_Values
=== RUN   TestConfigUtil_Visit
=== PAUSE TestConfigUtil_Visit
=== RUN   TestFlagMapValueSet
=== PAUSE TestFlagMapValueSet
=== RUN   TestAppendSliceValue_implements
=== PAUSE TestAppendSliceValue_implements
=== RUN   TestAppendSliceValueSet
=== PAUSE TestAppendSliceValueSet
=== RUN   TestHTTPFlagsSetToken
--- PASS: TestHTTPFlagsSetToken (0.00s)
=== CONT  TestConfigUtil_Values
=== CONT  TestFlagMapValueSet
=== RUN   TestFlagMapValueSet/missing_=
=== RUN   TestFlagMapValueSet/sets
=== RUN   TestFlagMapValueSet/sets_multiple
=== CONT  TestAppendSliceValue_implements
--- PASS: TestAppendSliceValue_implements (0.00s)
=== CONT  TestConfigUtil_Visit
=== CONT  TestAppendSliceValueSet
=== RUN   TestFlagMapValueSet/overwrites
--- PASS: TestAppendSliceValueSet (0.00s)
--- PASS: TestFlagMapValueSet (0.00s)
    --- PASS: TestFlagMapValueSet/missing_= (0.00s)
    --- PASS: TestFlagMapValueSet/sets (0.00s)
    --- PASS: TestFlagMapValueSet/sets_multiple (0.00s)
    --- PASS: TestFlagMapValueSet/overwrites (0.00s)
--- PASS: TestConfigUtil_Values (0.00s)
--- PASS: TestConfigUtil_Visit (0.08s)
PASS
ok  	github.com/hashicorp/consul/command/flags	0.121s
=== RUN   TestForceLeaveCommand_noTabs
=== PAUSE TestForceLeaveCommand_noTabs
=== RUN   TestForceLeaveCommand
=== PAUSE TestForceLeaveCommand
=== RUN   TestForceLeaveCommand_noAddrs
=== PAUSE TestForceLeaveCommand_noAddrs
=== CONT  TestForceLeaveCommand_noTabs
=== CONT  TestForceLeaveCommand_noAddrs
=== CONT  TestForceLeaveCommand
--- PASS: TestForceLeaveCommand_noTabs (0.00s)
--- PASS: TestForceLeaveCommand_noAddrs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestForceLeaveCommand - 2019/12/30 19:05:25.778543 [WARN] agent: Node name "Node d3cff437-ebd3-9123-67ba-84289bc8af13" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestForceLeaveCommand - 2019/12/30 19:05:26.698094 [DEBUG] tlsutil: Update with version 1
TestForceLeaveCommand - 2019/12/30 19:05:26.705912 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:05:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d3cff437-ebd3-9123-67ba-84289bc8af13 Address:127.0.0.1:44506}]
2019/12/30 19:05:28 [INFO]  raft: Node at 127.0.0.1:44506 [Follower] entering Follower state (Leader: "")
TestForceLeaveCommand - 2019/12/30 19:05:28.603171 [INFO] serf: EventMemberJoin: Node d3cff437-ebd3-9123-67ba-84289bc8af13.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:28.608656 [INFO] serf: EventMemberJoin: Node d3cff437-ebd3-9123-67ba-84289bc8af13 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:28.612985 [INFO] consul: Handled member-join event for server "Node d3cff437-ebd3-9123-67ba-84289bc8af13.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/30 19:05:28.613039 [INFO] consul: Adding LAN server Node d3cff437-ebd3-9123-67ba-84289bc8af13 (Addr: tcp/127.0.0.1:44506) (DC: dc1)
TestForceLeaveCommand - 2019/12/30 19:05:28.613336 [INFO] agent: Started DNS server 127.0.0.1:44501 (udp)
TestForceLeaveCommand - 2019/12/30 19:05:28.613399 [INFO] agent: Started DNS server 127.0.0.1:44501 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:28.616433 [INFO] agent: Started HTTP server on 127.0.0.1:44502 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:28.616707 [INFO] agent: started state syncer
2019/12/30 19:05:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:05:28 [INFO]  raft: Node at 127.0.0.1:44506 [Candidate] entering Candidate state in term 2
2019/12/30 19:05:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:05:29 [INFO]  raft: Node at 127.0.0.1:44506 [Leader] entering Leader state
TestForceLeaveCommand - 2019/12/30 19:05:29.505578 [INFO] consul: cluster leadership acquired
TestForceLeaveCommand - 2019/12/30 19:05:29.506256 [INFO] consul: New leader elected: Node d3cff437-ebd3-9123-67ba-84289bc8af13
TestForceLeaveCommand - 2019/12/30 19:05:30.231228 [INFO] agent: Synced node info
TestForceLeaveCommand - 2019/12/30 19:05:30.231340 [DEBUG] agent: Node info in sync
TestForceLeaveCommand - 2019/12/30 19:05:31.273064 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestForceLeaveCommand - 2019/12/30 19:05:31.420823 [WARN] agent: Node name "Node db2319e5-a254-0b27-19a5-503d0f068b04" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestForceLeaveCommand - 2019/12/30 19:05:31.421361 [DEBUG] tlsutil: Update with version 1
TestForceLeaveCommand - 2019/12/30 19:05:31.423772 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/30 19:05:31.945438 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestForceLeaveCommand - 2019/12/30 19:05:31.946676 [DEBUG] consul: Skipping self join check for "Node d3cff437-ebd3-9123-67ba-84289bc8af13" since the cluster is too small
TestForceLeaveCommand - 2019/12/30 19:05:31.946864 [INFO] consul: member 'Node d3cff437-ebd3-9123-67ba-84289bc8af13' joined, marking health alive
TestForceLeaveCommand - 2019/12/30 19:05:32.481907 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:05:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:db2319e5-a254-0b27-19a5-503d0f068b04 Address:127.0.0.1:44512}]
2019/12/30 19:05:34 [INFO]  raft: Node at 127.0.0.1:44512 [Follower] entering Follower state (Leader: "")
TestForceLeaveCommand - 2019/12/30 19:05:34.205563 [INFO] serf: EventMemberJoin: Node db2319e5-a254-0b27-19a5-503d0f068b04.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:34.228681 [INFO] serf: EventMemberJoin: Node db2319e5-a254-0b27-19a5-503d0f068b04 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:34.235420 [INFO] consul: Adding LAN server Node db2319e5-a254-0b27-19a5-503d0f068b04 (Addr: tcp/127.0.0.1:44512) (DC: dc1)
TestForceLeaveCommand - 2019/12/30 19:05:34.235795 [INFO] consul: Handled member-join event for server "Node db2319e5-a254-0b27-19a5-503d0f068b04.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/30 19:05:34.238941 [INFO] agent: Started DNS server 127.0.0.1:44507 (udp)
TestForceLeaveCommand - 2019/12/30 19:05:34.239317 [INFO] agent: Started DNS server 127.0.0.1:44507 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:34.248118 [INFO] agent: Started HTTP server on 127.0.0.1:44508 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:34.248253 [INFO] agent: started state syncer
2019/12/30 19:05:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:05:34 [INFO]  raft: Node at 127.0.0.1:44512 [Candidate] entering Candidate state in term 2
2019/12/30 19:05:37 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:05:37 [INFO]  raft: Node at 127.0.0.1:44512 [Leader] entering Leader state
TestForceLeaveCommand - 2019/12/30 19:05:37.297162 [INFO] consul: cluster leadership acquired
TestForceLeaveCommand - 2019/12/30 19:05:37.297678 [INFO] consul: New leader elected: Node db2319e5-a254-0b27-19a5-503d0f068b04
TestForceLeaveCommand - 2019/12/30 19:05:37.482353 [INFO] agent: (LAN) joining: [127.0.0.1:44504]
TestForceLeaveCommand - 2019/12/30 19:05:37.483043 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:44504
TestForceLeaveCommand - 2019/12/30 19:05:37.483117 [DEBUG] memberlist: Stream connection from=127.0.0.1:60018
TestForceLeaveCommand - 2019/12/30 19:05:37.492771 [INFO] serf: EventMemberJoin: Node db2319e5-a254-0b27-19a5-503d0f068b04 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:37.493724 [INFO] consul: Adding LAN server Node db2319e5-a254-0b27-19a5-503d0f068b04 (Addr: tcp/127.0.0.1:44512) (DC: dc1)
TestForceLeaveCommand - 2019/12/30 19:05:37.493825 [INFO] consul: New leader elected: Node db2319e5-a254-0b27-19a5-503d0f068b04
TestForceLeaveCommand - 2019/12/30 19:05:37.494493 [INFO] serf: EventMemberJoin: Node d3cff437-ebd3-9123-67ba-84289bc8af13 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:37.495239 [INFO] agent: (LAN) joined: 1
TestForceLeaveCommand - 2019/12/30 19:05:37.495322 [DEBUG] agent: systemd notify failed: No socket
TestForceLeaveCommand - 2019/12/30 19:05:37.495366 [INFO] agent: Requesting shutdown
TestForceLeaveCommand - 2019/12/30 19:05:37.495419 [INFO] consul: shutting down server
TestForceLeaveCommand - 2019/12/30 19:05:37.495475 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/30 19:05:37.496495 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:44511
TestForceLeaveCommand - 2019/12/30 19:05:37.497858 [ERR] consul: 'Node db2319e5-a254-0b27-19a5-503d0f068b04' and 'Node d3cff437-ebd3-9123-67ba-84289bc8af13' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestForceLeaveCommand - 2019/12/30 19:05:37.497977 [INFO] consul: member 'Node db2319e5-a254-0b27-19a5-503d0f068b04' joined, marking health alive
TestForceLeaveCommand - 2019/12/30 19:05:37.499904 [DEBUG] memberlist: Stream connection from=127.0.0.1:48382
TestForceLeaveCommand - 2019/12/30 19:05:37.503947 [INFO] serf: EventMemberJoin: Node d3cff437-ebd3-9123-67ba-84289bc8af13.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:37.504760 [INFO] consul: Handled member-join event for server "Node d3cff437-ebd3-9123-67ba-84289bc8af13.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/30 19:05:37.507632 [INFO] serf: EventMemberJoin: Node db2319e5-a254-0b27-19a5-503d0f068b04.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:37.508387 [DEBUG] consul: Successfully performed flood-join for "Node db2319e5-a254-0b27-19a5-503d0f068b04" at 127.0.0.1:44511
TestForceLeaveCommand - 2019/12/30 19:05:37.508790 [INFO] consul: Handled member-join event for server "Node db2319e5-a254-0b27-19a5-503d0f068b04.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/30 19:05:37.605776 [DEBUG] serf: messageJoinType: Node d3cff437-ebd3-9123-67ba-84289bc8af13.dc1
TestForceLeaveCommand - 2019/12/30 19:05:37.709107 [DEBUG] serf: messageJoinType: Node d3cff437-ebd3-9123-67ba-84289bc8af13.dc1
TestForceLeaveCommand - 2019/12/30 19:05:37.746994 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/30 19:05:37.941825 [INFO] agent: Synced node info
TestForceLeaveCommand - 2019/12/30 19:05:37.942195 [INFO] manager: shutting down
TestForceLeaveCommand - 2019/12/30 19:05:37.952232 [ERR] agent: failed to sync remote state: No cluster leader
TestForceLeaveCommand - 2019/12/30 19:05:38.110410 [DEBUG] memberlist: Failed ping: Node db2319e5-a254-0b27-19a5-503d0f068b04 (timeout reached)
TestForceLeaveCommand - 2019/12/30 19:05:38.230631 [INFO] agent: consul server down
TestForceLeaveCommand - 2019/12/30 19:05:38.230718 [INFO] agent: shutdown complete
TestForceLeaveCommand - 2019/12/30 19:05:38.230779 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:38.230944 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (udp)
TestForceLeaveCommand - 2019/12/30 19:05:38.231112 [INFO] agent: Stopping HTTP server 127.0.0.1:44508 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:38.231564 [INFO] agent: Waiting for endpoints to shut down
TestForceLeaveCommand - 2019/12/30 19:05:38.231637 [INFO] agent: Endpoints down
TestForceLeaveCommand - 2019/12/30 19:05:38.233899 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestForceLeaveCommand - 2019/12/30 19:05:38.234209 [ERR] consul: failed to establish leadership: raft is already shutdown
TestForceLeaveCommand - 2019/12/30 19:05:38.253595 [INFO] agent: Force leaving node: Node db2319e5-a254-0b27-19a5-503d0f068b04
TestForceLeaveCommand - 2019/12/30 19:05:38.482197 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/30 19:05:38.485555 [WARN] consul: error getting server health from "Node db2319e5-a254-0b27-19a5-503d0f068b04": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:44512: connect: connection refused
TestForceLeaveCommand - 2019/12/30 19:05:38.609447 [INFO] memberlist: Suspect Node db2319e5-a254-0b27-19a5-503d0f068b04 has failed, no acks received
TestForceLeaveCommand - 2019/12/30 19:05:39.445315 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestForceLeaveCommand - 2019/12/30 19:05:39.445494 [DEBUG] agent: Node info in sync
TestForceLeaveCommand - 2019/12/30 19:05:39.482161 [WARN] consul: error getting server health from "Node db2319e5-a254-0b27-19a5-503d0f068b04": context deadline exceeded
TestForceLeaveCommand - 2019/12/30 19:05:40.104146 [DEBUG] http: Request PUT /v1/agent/force-leave/Node%20db2319e5-a254-0b27-19a5-503d0f068b04 (1.850507065s) from=127.0.0.1:44056
TestForceLeaveCommand - 2019/12/30 19:05:40.110248 [DEBUG] memberlist: Failed ping: Node db2319e5-a254-0b27-19a5-503d0f068b04 (timeout reached)
TestForceLeaveCommand - 2019/12/30 19:05:40.490684 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/30 19:05:40.491452 [WARN] consul: error getting server health from "Node db2319e5-a254-0b27-19a5-503d0f068b04": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:44512: connect: connection refused
TestForceLeaveCommand - 2019/12/30 19:05:41.490566 [WARN] consul: error getting server health from "Node db2319e5-a254-0b27-19a5-503d0f068b04": context deadline exceeded
TestForceLeaveCommand - 2019/12/30 19:05:41.609341 [INFO] memberlist: Suspect Node db2319e5-a254-0b27-19a5-503d0f068b04 has failed, no acks received
TestForceLeaveCommand - 2019/12/30 19:05:42.481841 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/30 19:05:42.482647 [WARN] consul: error getting server health from "Node db2319e5-a254-0b27-19a5-503d0f068b04": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:44512: connect: connection refused
TestForceLeaveCommand - 2019/12/30 19:05:42.609970 [INFO] memberlist: Marking Node db2319e5-a254-0b27-19a5-503d0f068b04 as failed, suspect timeout reached (0 peer confirmations)
TestForceLeaveCommand - 2019/12/30 19:05:42.610299 [INFO] serf: EventMemberLeave: Node db2319e5-a254-0b27-19a5-503d0f068b04 127.0.0.1
TestForceLeaveCommand - 2019/12/30 19:05:42.610531 [INFO] consul: Removing LAN server Node db2319e5-a254-0b27-19a5-503d0f068b04 (Addr: tcp/127.0.0.1:44512) (DC: dc1)
TestForceLeaveCommand - 2019/12/30 19:05:42.610938 [INFO] consul: member 'Node db2319e5-a254-0b27-19a5-503d0f068b04' left, deregistering
TestForceLeaveCommand - 2019/12/30 19:05:42.620255 [INFO] agent: Requesting shutdown
TestForceLeaveCommand - 2019/12/30 19:05:42.620339 [INFO] consul: shutting down server
TestForceLeaveCommand - 2019/12/30 19:05:42.620384 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/30 19:05:42.905469 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/30 19:05:43.110205 [DEBUG] memberlist: Failed ping: Node db2319e5-a254-0b27-19a5-503d0f068b04 (timeout reached)
TestForceLeaveCommand - 2019/12/30 19:05:43.140592 [INFO] manager: shutting down
TestForceLeaveCommand - 2019/12/30 19:05:43.141422 [INFO] agent: consul server down
TestForceLeaveCommand - 2019/12/30 19:05:43.141495 [INFO] agent: shutdown complete
TestForceLeaveCommand - 2019/12/30 19:05:43.141560 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:43.141722 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (udp)
TestForceLeaveCommand - 2019/12/30 19:05:43.141958 [INFO] agent: Stopping HTTP server 127.0.0.1:44502 (tcp)
TestForceLeaveCommand - 2019/12/30 19:05:43.142560 [INFO] agent: Waiting for endpoints to shut down
TestForceLeaveCommand - 2019/12/30 19:05:43.142678 [INFO] agent: Endpoints down
--- PASS: TestForceLeaveCommand (17.59s)
PASS
ok  	github.com/hashicorp/consul/command/forceleave	18.659s
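Aside: the force-leave test above issues `PUT /v1/agent/force-leave/Node%20db2319e5-...`, so the node name containing a space is percent-encoded in the URL path. A minimal sketch of that encoding using Go's standard `net/url` package (the helper name is hypothetical, for illustration only, not Consul's actual code):

```go
package main

import (
	"fmt"
	"net/url"
)

// forceLeavePath builds the agent force-leave endpoint path for a node
// name, percent-encoding characters (such as spaces) that are not valid
// in a URL path segment. Hypothetical helper, not taken from consul.
func forceLeavePath(node string) string {
	return "/v1/agent/force-leave/" + url.PathEscape(node)
}

func main() {
	// Prints the same escaped path seen in the request log above.
	fmt.Println(forceLeavePath("Node db2319e5-a254-0b27-19a5-503d0f068b04"))
}
```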
?   	github.com/hashicorp/consul/command/helpers	[no test files]
=== RUN   TestInfoCommand_noTabs
=== PAUSE TestInfoCommand_noTabs
=== RUN   TestInfoCommand
=== PAUSE TestInfoCommand
=== CONT  TestInfoCommand_noTabs
=== CONT  TestInfoCommand
--- PASS: TestInfoCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestInfoCommand - 2019/12/30 19:05:31.398549 [WARN] agent: Node name "Node 399ccfdb-7b43-f4b2-6f98-f99d5ed3f82f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestInfoCommand - 2019/12/30 19:05:31.399757 [DEBUG] tlsutil: Update with version 1
TestInfoCommand - 2019/12/30 19:05:31.406555 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:05:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:399ccfdb-7b43-f4b2-6f98-f99d5ed3f82f Address:127.0.0.1:10006}]
2019/12/30 19:05:34 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestInfoCommand - 2019/12/30 19:05:34.204296 [INFO] serf: EventMemberJoin: Node 399ccfdb-7b43-f4b2-6f98-f99d5ed3f82f.dc1 127.0.0.1
TestInfoCommand - 2019/12/30 19:05:34.215261 [INFO] serf: EventMemberJoin: Node 399ccfdb-7b43-f4b2-6f98-f99d5ed3f82f 127.0.0.1
TestInfoCommand - 2019/12/30 19:05:34.218793 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestInfoCommand - 2019/12/30 19:05:34.220329 [INFO] consul: Adding LAN server Node 399ccfdb-7b43-f4b2-6f98-f99d5ed3f82f (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestInfoCommand - 2019/12/30 19:05:34.220671 [INFO] consul: Handled member-join event for server "Node 399ccfdb-7b43-f4b2-6f98-f99d5ed3f82f.dc1" in area "wan"
TestInfoCommand - 2019/12/30 19:05:34.221394 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestInfoCommand - 2019/12/30 19:05:34.226521 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestInfoCommand - 2019/12/30 19:05:34.226717 [INFO] agent: started state syncer
2019/12/30 19:05:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:05:34 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/12/30 19:05:36 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:05:36 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestInfoCommand - 2019/12/30 19:05:36.897124 [INFO] consul: cluster leadership acquired
TestInfoCommand - 2019/12/30 19:05:36.897661 [INFO] consul: New leader elected: Node 399ccfdb-7b43-f4b2-6f98-f99d5ed3f82f
TestInfoCommand - 2019/12/30 19:05:37.514956 [INFO] agent: Synced node info
TestInfoCommand - 2019/12/30 19:05:37.515345 [DEBUG] agent: Node info in sync
TestInfoCommand - 2019/12/30 19:05:37.534005 [DEBUG] http: Request GET /v1/agent/self (480.028728ms) from=127.0.0.1:52390
TestInfoCommand - 2019/12/30 19:05:37.549730 [INFO] agent: Requesting shutdown
TestInfoCommand - 2019/12/30 19:05:37.549829 [INFO] consul: shutting down server
TestInfoCommand - 2019/12/30 19:05:37.549881 [WARN] serf: Shutdown without a Leave
TestInfoCommand - 2019/12/30 19:05:37.588232 [DEBUG] agent: Node info in sync
TestInfoCommand - 2019/12/30 19:05:37.942132 [WARN] serf: Shutdown without a Leave
TestInfoCommand - 2019/12/30 19:05:38.122081 [INFO] manager: shutting down
TestInfoCommand - 2019/12/30 19:05:38.230646 [INFO] agent: consul server down
TestInfoCommand - 2019/12/30 19:05:38.230959 [INFO] agent: shutdown complete
TestInfoCommand - 2019/12/30 19:05:38.231140 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestInfoCommand - 2019/12/30 19:05:38.231337 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestInfoCommand - 2019/12/30 19:05:38.231515 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestInfoCommand - 2019/12/30 19:05:38.232314 [INFO] agent: Waiting for endpoints to shut down
TestInfoCommand - 2019/12/30 19:05:38.232501 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestInfoCommand - 2019/12/30 19:05:38.232817 [INFO] agent: Endpoints down
--- PASS: TestInfoCommand (9.31s)
PASS
ok  	github.com/hashicorp/consul/command/info	9.626s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== CONT  TestCommand_noTabs
--- PASS: TestCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/intention	0.041s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== CONT  TestCommand_noTabs
=== CONT  TestCommand
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/1_args
=== RUN   TestCommand_Validation/3_args
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/1_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
--- PASS: TestCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/12/30 19:05:56.802345 [WARN] agent: Node name "Node a58c1718-1e63-7b11-50b0-d41c6823ebfd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/12/30 19:05:56.803217 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/12/30 19:05:56.812135 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:05:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a58c1718-1e63-7b11-50b0-d41c6823ebfd Address:127.0.0.1:43006}]
2019/12/30 19:05:57 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/12/30 19:05:57.651907 [INFO] serf: EventMemberJoin: Node a58c1718-1e63-7b11-50b0-d41c6823ebfd.dc1 127.0.0.1
TestCommand - 2019/12/30 19:05:57.655796 [INFO] serf: EventMemberJoin: Node a58c1718-1e63-7b11-50b0-d41c6823ebfd 127.0.0.1
TestCommand - 2019/12/30 19:05:57.657507 [INFO] consul: Handled member-join event for server "Node a58c1718-1e63-7b11-50b0-d41c6823ebfd.dc1" in area "wan"
TestCommand - 2019/12/30 19:05:57.657715 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestCommand - 2019/12/30 19:05:57.658047 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestCommand - 2019/12/30 19:05:57.658124 [INFO] consul: Adding LAN server Node a58c1718-1e63-7b11-50b0-d41c6823ebfd (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestCommand - 2019/12/30 19:05:57.661009 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestCommand - 2019/12/30 19:05:57.661160 [INFO] agent: started state syncer
2019/12/30 19:05:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:05:57 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
2019/12/30 19:05:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:05:58 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestCommand - 2019/12/30 19:05:58.156043 [INFO] consul: cluster leadership acquired
TestCommand - 2019/12/30 19:05:58.156613 [INFO] consul: New leader elected: Node a58c1718-1e63-7b11-50b0-d41c6823ebfd
TestCommand - 2019/12/30 19:05:58.723562 [INFO] agent: Synced node info
TestCommand - 2019/12/30 19:05:58.723701 [DEBUG] agent: Node info in sync
TestCommand - 2019/12/30 19:05:58.726204 [DEBUG] http: Request POST /v1/connect/intentions (293.497381ms) from=127.0.0.1:35340
TestCommand - 2019/12/30 19:05:58.734693 [DEBUG] http: Request GET /v1/connect/intentions/check?destination=db&source=foo&source-type=consul (1.563043ms) from=127.0.0.1:35342
TestCommand - 2019/12/30 19:05:58.746793 [DEBUG] http: Request GET /v1/connect/intentions/check?destination=db&source=web&source-type=consul (1.041696ms) from=127.0.0.1:35344
TestCommand - 2019/12/30 19:05:58.758820 [INFO] agent: Requesting shutdown
TestCommand - 2019/12/30 19:05:58.758914 [INFO] consul: shutting down server
TestCommand - 2019/12/30 19:05:58.758960 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/30 19:05:58.914046 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/30 19:05:59.048126 [INFO] manager: shutting down
TestCommand - 2019/12/30 19:05:59.050699 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand - 2019/12/30 19:05:59.051131 [INFO] agent: consul server down
TestCommand - 2019/12/30 19:05:59.051188 [INFO] agent: shutdown complete
TestCommand - 2019/12/30 19:05:59.051251 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestCommand - 2019/12/30 19:05:59.051383 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestCommand - 2019/12/30 19:05:59.051524 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestCommand - 2019/12/30 19:05:59.052327 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/12/30 19:05:59.052502 [INFO] agent: Endpoints down
--- PASS: TestCommand (2.46s)
PASS
ok  	github.com/hashicorp/consul/command/intention/check	3.048s
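Aside: the `--- PASS: TestName (1.23s)` summary lines throughout this log follow the fixed `go test -v` format. A minimal sketch of extracting test names and outcomes from such a log (an illustrative helper, not part of this build; real `go test` output has more variants, e.g. `--- SKIP` and indented subtests, than this handles):

```go
package main

import (
	"fmt"
	"strings"
)

// parseTestResult extracts the test name and outcome from a `go test -v`
// summary line such as "--- PASS: TestCommand (2.46s)".
// ok is false for lines that are not pass/fail summary lines.
func parseTestResult(line string) (name string, passed bool, ok bool) {
	trimmed := strings.TrimSpace(line)
	var prefix string
	switch {
	case strings.HasPrefix(trimmed, "--- PASS: "):
		prefix, passed = "--- PASS: ", true
	case strings.HasPrefix(trimmed, "--- FAIL: "):
		prefix, passed = "--- FAIL: ", false
	default:
		return "", false, false
	}
	rest := strings.TrimPrefix(trimmed, prefix)
	// The test name ends at the space before the "(2.46s)" duration.
	if i := strings.IndexByte(rest, ' '); i > 0 {
		rest = rest[:i]
	}
	return rest, passed, true
}

func main() {
	name, passed, ok := parseTestResult("--- PASS: TestForceLeaveCommand (17.59s)")
	fmt.Println(name, passed, ok)
}
```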
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== RUN   TestCommand_deny
=== PAUSE TestCommand_deny
=== RUN   TestCommand_meta
=== PAUSE TestCommand_meta
=== RUN   TestCommand_File
=== PAUSE TestCommand_File
=== RUN   TestCommand_FileNoExist
=== PAUSE TestCommand_FileNoExist
=== RUN   TestCommand_replace
=== PAUSE TestCommand_replace
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_File
=== CONT  TestCommand_deny
=== CONT  TestCommand_replace
=== CONT  TestCommand_FileNoExist
--- PASS: TestCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_replace - 2019/12/30 19:05:59.640661 [WARN] agent: Node name "Node 95e3e0f7-85da-b4e4-9b8b-fb435f1bf7a2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_FileNoExist - 2019/12/30 19:05:59.650796 [WARN] agent: Node name "Node 349df538-135e-8ddf-1bf9-e305ca46c344" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_FileNoExist - 2019/12/30 19:05:59.651250 [DEBUG] tlsutil: Update with version 1
TestCommand_replace - 2019/12/30 19:05:59.651730 [DEBUG] tlsutil: Update with version 1
TestCommand_replace - 2019/12/30 19:05:59.658018 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_FileNoExist - 2019/12/30 19:05:59.660180 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_deny - 2019/12/30 19:05:59.681727 [WARN] agent: Node name "Node a86cca6e-37b4-0733-c067-601f81fea459" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_deny - 2019/12/30 19:05:59.682434 [DEBUG] tlsutil: Update with version 1
TestCommand_deny - 2019/12/30 19:05:59.685452 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File - 2019/12/30 19:05:59.689725 [WARN] agent: Node name "Node 3654e2de-6ed1-a527-96f3-b0461e6bf989" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File - 2019/12/30 19:05:59.690251 [DEBUG] tlsutil: Update with version 1
TestCommand_File - 2019/12/30 19:05:59.706669 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:06:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a86cca6e-37b4-0733-c067-601f81fea459 Address:127.0.0.1:49018}]
2019/12/30 19:06:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:349df538-135e-8ddf-1bf9-e305ca46c344 Address:127.0.0.1:49024}]
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49024 [Follower] entering Follower state (Leader: "")
TestCommand_FileNoExist - 2019/12/30 19:06:00.753052 [INFO] serf: EventMemberJoin: Node 349df538-135e-8ddf-1bf9-e305ca46c344.dc1 127.0.0.1
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49018 [Follower] entering Follower state (Leader: "")
TestCommand_deny - 2019/12/30 19:06:00.755070 [INFO] serf: EventMemberJoin: Node a86cca6e-37b4-0733-c067-601f81fea459.dc1 127.0.0.1
TestCommand_FileNoExist - 2019/12/30 19:06:00.758774 [INFO] serf: EventMemberJoin: Node 349df538-135e-8ddf-1bf9-e305ca46c344 127.0.0.1
TestCommand_deny - 2019/12/30 19:06:00.758774 [INFO] serf: EventMemberJoin: Node a86cca6e-37b4-0733-c067-601f81fea459 127.0.0.1
TestCommand_deny - 2019/12/30 19:06:00.760854 [INFO] consul: Handled member-join event for server "Node a86cca6e-37b4-0733-c067-601f81fea459.dc1" in area "wan"
TestCommand_deny - 2019/12/30 19:06:00.761291 [INFO] consul: Adding LAN server Node a86cca6e-37b4-0733-c067-601f81fea459 (Addr: tcp/127.0.0.1:49018) (DC: dc1)
TestCommand_deny - 2019/12/30 19:06:00.761297 [INFO] agent: Started DNS server 127.0.0.1:49013 (udp)
TestCommand_deny - 2019/12/30 19:06:00.761748 [INFO] agent: Started DNS server 127.0.0.1:49013 (tcp)
TestCommand_FileNoExist - 2019/12/30 19:06:00.762094 [INFO] consul: Adding LAN server Node 349df538-135e-8ddf-1bf9-e305ca46c344 (Addr: tcp/127.0.0.1:49024) (DC: dc1)
TestCommand_deny - 2019/12/30 19:06:00.764496 [INFO] agent: Started HTTP server on 127.0.0.1:49014 (tcp)
TestCommand_deny - 2019/12/30 19:06:00.764705 [INFO] agent: started state syncer
TestCommand_FileNoExist - 2019/12/30 19:06:00.766905 [INFO] consul: Handled member-join event for server "Node 349df538-135e-8ddf-1bf9-e305ca46c344.dc1" in area "wan"
TestCommand_FileNoExist - 2019/12/30 19:06:00.767981 [INFO] agent: Started DNS server 127.0.0.1:49019 (udp)
TestCommand_FileNoExist - 2019/12/30 19:06:00.768397 [INFO] agent: Started DNS server 127.0.0.1:49019 (tcp)
TestCommand_FileNoExist - 2019/12/30 19:06:00.770847 [INFO] agent: Started HTTP server on 127.0.0.1:49020 (tcp)
TestCommand_FileNoExist - 2019/12/30 19:06:00.770959 [INFO] agent: started state syncer
2019/12/30 19:06:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49024 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49018 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:95e3e0f7-85da-b4e4-9b8b-fb435f1bf7a2 Address:127.0.0.1:49012}]
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49012 [Follower] entering Follower state (Leader: "")
2019/12/30 19:06:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3654e2de-6ed1-a527-96f3-b0461e6bf989 Address:127.0.0.1:49006}]
TestCommand_replace - 2019/12/30 19:06:00.868528 [INFO] serf: EventMemberJoin: Node 95e3e0f7-85da-b4e4-9b8b-fb435f1bf7a2.dc1 127.0.0.1
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49006 [Follower] entering Follower state (Leader: "")
TestCommand_File - 2019/12/30 19:06:00.868528 [INFO] serf: EventMemberJoin: Node 3654e2de-6ed1-a527-96f3-b0461e6bf989.dc1 127.0.0.1
TestCommand_replace - 2019/12/30 19:06:00.891827 [INFO] serf: EventMemberJoin: Node 95e3e0f7-85da-b4e4-9b8b-fb435f1bf7a2 127.0.0.1
TestCommand_replace - 2019/12/30 19:06:00.893380 [INFO] agent: Started DNS server 127.0.0.1:49007 (udp)
TestCommand_replace - 2019/12/30 19:06:00.893980 [INFO] consul: Adding LAN server Node 95e3e0f7-85da-b4e4-9b8b-fb435f1bf7a2 (Addr: tcp/127.0.0.1:49012) (DC: dc1)
TestCommand_replace - 2019/12/30 19:06:00.894224 [INFO] consul: Handled member-join event for server "Node 95e3e0f7-85da-b4e4-9b8b-fb435f1bf7a2.dc1" in area "wan"
TestCommand_File - 2019/12/30 19:06:00.895375 [INFO] serf: EventMemberJoin: Node 3654e2de-6ed1-a527-96f3-b0461e6bf989 127.0.0.1
TestCommand_replace - 2019/12/30 19:06:00.900211 [INFO] agent: Started DNS server 127.0.0.1:49007 (tcp)
TestCommand_File - 2019/12/30 19:06:00.900288 [INFO] agent: Started DNS server 127.0.0.1:49001 (udp)
TestCommand_replace - 2019/12/30 19:06:00.904866 [INFO] agent: Started HTTP server on 127.0.0.1:49008 (tcp)
TestCommand_replace - 2019/12/30 19:06:00.905090 [INFO] agent: started state syncer
TestCommand_File - 2019/12/30 19:06:00.909310 [INFO] consul: Adding LAN server Node 3654e2de-6ed1-a527-96f3-b0461e6bf989 (Addr: tcp/127.0.0.1:49006) (DC: dc1)
TestCommand_File - 2019/12/30 19:06:00.910502 [INFO] consul: Handled member-join event for server "Node 3654e2de-6ed1-a527-96f3-b0461e6bf989.dc1" in area "wan"
TestCommand_File - 2019/12/30 19:06:00.914369 [INFO] agent: Started DNS server 127.0.0.1:49001 (tcp)
2019/12/30 19:06:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49012 [Candidate] entering Candidate state in term 2
TestCommand_File - 2019/12/30 19:06:00.922101 [INFO] agent: Started HTTP server on 127.0.0.1:49002 (tcp)
TestCommand_File - 2019/12/30 19:06:00.922277 [INFO] agent: started state syncer
2019/12/30 19:06:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:00 [INFO]  raft: Node at 127.0.0.1:49006 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:01 [INFO]  raft: Node at 127.0.0.1:49018 [Leader] entering Leader state
2019/12/30 19:06:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:01 [INFO]  raft: Node at 127.0.0.1:49024 [Leader] entering Leader state
TestCommand_deny - 2019/12/30 19:06:01.631150 [INFO] consul: cluster leadership acquired
TestCommand_deny - 2019/12/30 19:06:01.631734 [INFO] consul: New leader elected: Node a86cca6e-37b4-0733-c067-601f81fea459
TestCommand_FileNoExist - 2019/12/30 19:06:01.632042 [INFO] consul: cluster leadership acquired
TestCommand_FileNoExist - 2019/12/30 19:06:01.632449 [INFO] consul: New leader elected: Node 349df538-135e-8ddf-1bf9-e305ca46c344
TestCommand_FileNoExist - 2019/12/30 19:06:01.717974 [INFO] agent: Requesting shutdown
TestCommand_FileNoExist - 2019/12/30 19:06:01.718115 [INFO] consul: shutting down server
TestCommand_FileNoExist - 2019/12/30 19:06:01.718170 [WARN] serf: Shutdown without a Leave
2019/12/30 19:06:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:01 [INFO]  raft: Node at 127.0.0.1:49012 [Leader] entering Leader state
TestCommand_replace - 2019/12/30 19:06:01.759133 [INFO] consul: cluster leadership acquired
TestCommand_replace - 2019/12/30 19:06:01.759663 [INFO] consul: New leader elected: Node 95e3e0f7-85da-b4e4-9b8b-fb435f1bf7a2
2019/12/30 19:06:01 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:01 [INFO]  raft: Node at 127.0.0.1:49006 [Leader] entering Leader state
TestCommand_File - 2019/12/30 19:06:01.760982 [INFO] consul: cluster leadership acquired
TestCommand_File - 2019/12/30 19:06:01.761398 [INFO] consul: New leader elected: Node 3654e2de-6ed1-a527-96f3-b0461e6bf989
TestCommand_FileNoExist - 2019/12/30 19:06:01.848540 [WARN] serf: Shutdown without a Leave
TestCommand_FileNoExist - 2019/12/30 19:06:01.915428 [INFO] manager: shutting down
TestCommand_FileNoExist - 2019/12/30 19:06:02.106208 [INFO] agent: consul server down
TestCommand_FileNoExist - 2019/12/30 19:06:02.106301 [INFO] agent: shutdown complete
TestCommand_FileNoExist - 2019/12/30 19:06:02.106363 [INFO] agent: Stopping DNS server 127.0.0.1:49019 (tcp)
TestCommand_FileNoExist - 2019/12/30 19:06:02.106569 [INFO] agent: Stopping DNS server 127.0.0.1:49019 (udp)
TestCommand_FileNoExist - 2019/12/30 19:06:02.106751 [INFO] agent: Stopping HTTP server 127.0.0.1:49020 (tcp)
TestCommand_FileNoExist - 2019/12/30 19:06:02.106800 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommand_FileNoExist - 2019/12/30 19:06:02.106922 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestCommand_FileNoExist - 2019/12/30 19:06:02.106983 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestCommand_FileNoExist - 2019/12/30 19:06:02.106998 [INFO] agent: Waiting for endpoints to shut down
TestCommand_FileNoExist - 2019/12/30 19:06:02.107101 [INFO] agent: Endpoints down
--- PASS: TestCommand_FileNoExist (2.68s)
=== CONT  TestCommand_meta
TestCommand_replace - 2019/12/30 19:06:02.216007 [INFO] agent: Synced node info
TestCommand_replace - 2019/12/30 19:06:02.216140 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_replace - 2019/12/30 19:06:02.236697 [DEBUG] http: Request POST /v1/connect/intentions (381.418789ms) from=127.0.0.1:42222
TestCommand_File - 2019/12/30 19:06:02.238084 [INFO] agent: Synced node info
TestCommand_File - 2019/12/30 19:06:02.238509 [DEBUG] http: Request POST /v1/connect/intentions (303.225645ms) from=127.0.0.1:49158
TestCommand_meta - 2019/12/30 19:06:02.250654 [WARN] agent: Node name "Node 907ac713-16d1-0a38-f37b-a1d7b8ba5955" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_meta - 2019/12/30 19:06:02.251224 [DEBUG] tlsutil: Update with version 1
TestCommand_replace - 2019/12/30 19:06:02.254013 [DEBUG] http: Request GET /v1/connect/intentions (2.725741ms) from=127.0.0.1:42228
TestCommand_File - 2019/12/30 19:06:02.261109 [DEBUG] http: Request GET /v1/connect/intentions (3.193754ms) from=127.0.0.1:49162
TestCommand_File - 2019/12/30 19:06:02.267760 [INFO] agent: Requesting shutdown
TestCommand_File - 2019/12/30 19:06:02.267858 [INFO] consul: shutting down server
TestCommand_File - 2019/12/30 19:06:02.268111 [WARN] serf: Shutdown without a Leave
TestCommand_meta - 2019/12/30 19:06:02.268086 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File - 2019/12/30 19:06:02.447465 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/30 19:06:02.522589 [INFO] manager: shutting down
TestCommand_deny - 2019/12/30 19:06:02.528105 [INFO] agent: Synced node info
TestCommand_deny - 2019/12/30 19:06:02.528825 [DEBUG] http: Request POST /v1/connect/intentions (599.133089ms) from=127.0.0.1:49900
TestCommand_deny - 2019/12/30 19:06:02.533432 [DEBUG] http: Request GET /v1/connect/intentions (1.122365ms) from=127.0.0.1:49910
TestCommand_deny - 2019/12/30 19:06:02.535915 [INFO] agent: Requesting shutdown
TestCommand_deny - 2019/12/30 19:06:02.536042 [INFO] consul: shutting down server
TestCommand_deny - 2019/12/30 19:06:02.536094 [WARN] serf: Shutdown without a Leave
TestCommand_deny - 2019/12/30 19:06:02.632979 [WARN] serf: Shutdown without a Leave
TestCommand_deny - 2019/12/30 19:06:02.730957 [INFO] manager: shutting down
TestCommand_File - 2019/12/30 19:06:02.731204 [INFO] agent: consul server down
TestCommand_File - 2019/12/30 19:06:02.731267 [INFO] agent: shutdown complete
TestCommand_File - 2019/12/30 19:06:02.731333 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (tcp)
TestCommand_File - 2019/12/30 19:06:02.731508 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (udp)
TestCommand_File - 2019/12/30 19:06:02.731685 [INFO] agent: Stopping HTTP server 127.0.0.1:49002 (tcp)
TestCommand_replace - 2019/12/30 19:06:02.731828 [ERR] http: Request POST /v1/connect/intentions, error: duplicate intention found: ALLOW default/foo => default/bar (ID: 2662137d-8d85-0689-95ea-2dba9d9c7c24, Precedence: 9) from=127.0.0.1:42232
TestCommand_File - 2019/12/30 19:06:02.732299 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_File - 2019/12/30 19:06:02.732979 [INFO] agent: Waiting for endpoints to shut down
TestCommand_replace - 2019/12/30 19:06:02.733028 [DEBUG] http: Request POST /v1/connect/intentions (469.4122ms) from=127.0.0.1:42232
TestCommand_File - 2019/12/30 19:06:02.733101 [INFO] agent: Endpoints down
--- PASS: TestCommand_File (3.31s)
=== CONT  TestCommand
TestCommand_replace - 2019/12/30 19:06:02.750826 [DEBUG] http: Request GET /v1/connect/intentions (1.090697ms) from=127.0.0.1:42236
TestCommand_deny - 2019/12/30 19:06:02.823628 [INFO] agent: consul server down
TestCommand_deny - 2019/12/30 19:06:02.823711 [INFO] agent: shutdown complete
TestCommand_deny - 2019/12/30 19:06:02.823774 [INFO] agent: Stopping DNS server 127.0.0.1:49013 (tcp)
TestCommand_deny - 2019/12/30 19:06:02.823943 [INFO] agent: Stopping DNS server 127.0.0.1:49013 (udp)
TestCommand_deny - 2019/12/30 19:06:02.824142 [INFO] agent: Stopping HTTP server 127.0.0.1:49014 (tcp)
TestCommand_deny - 2019/12/30 19:06:02.825375 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_deny - 2019/12/30 19:06:02.825845 [INFO] agent: Waiting for endpoints to shut down
TestCommand_deny - 2019/12/30 19:06:02.825959 [INFO] agent: Endpoints down
--- PASS: TestCommand_deny (3.40s)
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/-allow_and_-deny
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/-allow_and_-deny (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/12/30 19:06:02.861292 [WARN] agent: Node name "Node 52b45ada-5de1-80fd-0035-eb86407df3e6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/12/30 19:06:02.861781 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/12/30 19:06:02.864056 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:06:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:907ac713-16d1-0a38-f37b-a1d7b8ba5955 Address:127.0.0.1:49030}]
2019/12/30 19:06:03 [INFO]  raft: Node at 127.0.0.1:49030 [Follower] entering Follower state (Leader: "")
TestCommand_replace - 2019/12/30 19:06:03.449807 [DEBUG] http: Request PUT /v1/connect/intentions/2662137d-8d85-0689-95ea-2dba9d9c7c24 (696.050745ms) from=127.0.0.1:42236
TestCommand_meta - 2019/12/30 19:06:03.452057 [INFO] serf: EventMemberJoin: Node 907ac713-16d1-0a38-f37b-a1d7b8ba5955.dc1 127.0.0.1
TestCommand_replace - 2019/12/30 19:06:03.457186 [DEBUG] http: Request GET /v1/connect/intentions (779.354µs) from=127.0.0.1:42228
TestCommand_meta - 2019/12/30 19:06:03.457377 [INFO] serf: EventMemberJoin: Node 907ac713-16d1-0a38-f37b-a1d7b8ba5955 127.0.0.1
TestCommand_replace - 2019/12/30 19:06:03.460249 [INFO] agent: Requesting shutdown
TestCommand_replace - 2019/12/30 19:06:03.460344 [INFO] consul: shutting down server
TestCommand_replace - 2019/12/30 19:06:03.460393 [WARN] serf: Shutdown without a Leave
TestCommand_meta - 2019/12/30 19:06:03.462137 [INFO] agent: Started DNS server 127.0.0.1:49025 (udp)
TestCommand_meta - 2019/12/30 19:06:03.462598 [INFO] agent: Started DNS server 127.0.0.1:49025 (tcp)
TestCommand_meta - 2019/12/30 19:06:03.462747 [INFO] consul: Adding LAN server Node 907ac713-16d1-0a38-f37b-a1d7b8ba5955 (Addr: tcp/127.0.0.1:49030) (DC: dc1)
TestCommand_meta - 2019/12/30 19:06:03.462968 [INFO] consul: Handled member-join event for server "Node 907ac713-16d1-0a38-f37b-a1d7b8ba5955.dc1" in area "wan"
TestCommand_meta - 2019/12/30 19:06:03.465318 [INFO] agent: Started HTTP server on 127.0.0.1:49026 (tcp)
TestCommand_meta - 2019/12/30 19:06:03.465774 [INFO] agent: started state syncer
2019/12/30 19:06:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:03 [INFO]  raft: Node at 127.0.0.1:49030 [Candidate] entering Candidate state in term 2
TestCommand_replace - 2019/12/30 19:06:03.639235 [WARN] serf: Shutdown without a Leave
TestCommand_replace - 2019/12/30 19:06:03.714460 [INFO] manager: shutting down
TestCommand_replace - 2019/12/30 19:06:03.714488 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestCommand_replace - 2019/12/30 19:06:03.714750 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCommand_replace - 2019/12/30 19:06:03.714830 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestCommand_replace - 2019/12/30 19:06:03.714898 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestCommand_replace - 2019/12/30 19:06:03.714958 [ERR] consul: failed to transfer leadership in 3 attempts
TestCommand_replace - 2019/12/30 19:06:03.714912 [INFO] agent: consul server down
TestCommand_replace - 2019/12/30 19:06:03.715063 [INFO] agent: shutdown complete
TestCommand_replace - 2019/12/30 19:06:03.715135 [INFO] agent: Stopping DNS server 127.0.0.1:49007 (tcp)
TestCommand_replace - 2019/12/30 19:06:03.715444 [INFO] agent: Stopping DNS server 127.0.0.1:49007 (udp)
TestCommand_replace - 2019/12/30 19:06:03.715631 [INFO] agent: Stopping HTTP server 127.0.0.1:49008 (tcp)
TestCommand_replace - 2019/12/30 19:06:03.716655 [INFO] agent: Waiting for endpoints to shut down
TestCommand_replace - 2019/12/30 19:06:03.716753 [INFO] agent: Endpoints down
--- PASS: TestCommand_replace (4.29s)
2019/12/30 19:06:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:52b45ada-5de1-80fd-0035-eb86407df3e6 Address:127.0.0.1:49036}]
2019/12/30 19:06:03 [INFO]  raft: Node at 127.0.0.1:49036 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/12/30 19:06:03.877574 [INFO] serf: EventMemberJoin: Node 52b45ada-5de1-80fd-0035-eb86407df3e6.dc1 127.0.0.1
TestCommand - 2019/12/30 19:06:03.883646 [INFO] serf: EventMemberJoin: Node 52b45ada-5de1-80fd-0035-eb86407df3e6 127.0.0.1
TestCommand - 2019/12/30 19:06:03.885112 [INFO] consul: Adding LAN server Node 52b45ada-5de1-80fd-0035-eb86407df3e6 (Addr: tcp/127.0.0.1:49036) (DC: dc1)
TestCommand - 2019/12/30 19:06:03.885636 [INFO] agent: Started DNS server 127.0.0.1:49031 (udp)
TestCommand - 2019/12/30 19:06:03.885997 [INFO] consul: Handled member-join event for server "Node 52b45ada-5de1-80fd-0035-eb86407df3e6.dc1" in area "wan"
TestCommand - 2019/12/30 19:06:03.886049 [INFO] agent: Started DNS server 127.0.0.1:49031 (tcp)
TestCommand - 2019/12/30 19:06:03.890920 [INFO] agent: Started HTTP server on 127.0.0.1:49032 (tcp)
TestCommand - 2019/12/30 19:06:03.891032 [INFO] agent: started state syncer
2019/12/30 19:06:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:03 [INFO]  raft: Node at 127.0.0.1:49036 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:04 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:04 [INFO]  raft: Node at 127.0.0.1:49030 [Leader] entering Leader state
TestCommand_meta - 2019/12/30 19:06:04.231358 [INFO] consul: cluster leadership acquired
TestCommand_meta - 2019/12/30 19:06:04.231921 [INFO] consul: New leader elected: Node 907ac713-16d1-0a38-f37b-a1d7b8ba5955
TestCommand_meta - 2019/12/30 19:06:04.841381 [INFO] agent: Synced node info
TestCommand_meta - 2019/12/30 19:06:04.841511 [DEBUG] agent: Node info in sync
TestCommand_meta - 2019/12/30 19:06:04.842671 [DEBUG] http: Request POST /v1/connect/intentions (511.886696ms) from=127.0.0.1:53762
TestCommand_meta - 2019/12/30 19:06:04.847454 [DEBUG] http: Request GET /v1/connect/intentions (1.111363ms) from=127.0.0.1:53764
TestCommand_meta - 2019/12/30 19:06:04.850080 [INFO] agent: Requesting shutdown
TestCommand_meta - 2019/12/30 19:06:04.850203 [INFO] consul: shutting down server
TestCommand_meta - 2019/12/30 19:06:04.850253 [WARN] serf: Shutdown without a Leave
2019/12/30 19:06:04 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:04 [INFO]  raft: Node at 127.0.0.1:49036 [Leader] entering Leader state
TestCommand - 2019/12/30 19:06:04.914609 [INFO] consul: cluster leadership acquired
TestCommand - 2019/12/30 19:06:04.915149 [INFO] consul: New leader elected: Node 52b45ada-5de1-80fd-0035-eb86407df3e6
TestCommand_meta - 2019/12/30 19:06:04.997482 [WARN] serf: Shutdown without a Leave
TestCommand_meta - 2019/12/30 19:06:05.072521 [INFO] manager: shutting down
TestCommand_meta - 2019/12/30 19:06:05.073016 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCommand_meta - 2019/12/30 19:06:05.073400 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommand_meta - 2019/12/30 19:06:05.074518 [INFO] agent: consul server down
TestCommand_meta - 2019/12/30 19:06:05.074610 [INFO] agent: shutdown complete
TestCommand_meta - 2019/12/30 19:06:05.074672 [INFO] agent: Stopping DNS server 127.0.0.1:49025 (tcp)
TestCommand_meta - 2019/12/30 19:06:05.074829 [INFO] agent: Stopping DNS server 127.0.0.1:49025 (udp)
TestCommand_meta - 2019/12/30 19:06:05.074998 [INFO] agent: Stopping HTTP server 127.0.0.1:49026 (tcp)
TestCommand_meta - 2019/12/30 19:06:05.075720 [INFO] agent: Waiting for endpoints to shut down
TestCommand_meta - 2019/12/30 19:06:05.075936 [INFO] agent: Endpoints down
--- PASS: TestCommand_meta (2.97s)
TestCommand - 2019/12/30 19:06:05.215218 [INFO] agent: Synced node info
TestCommand - 2019/12/30 19:06:05.215370 [DEBUG] agent: Node info in sync
TestCommand - 2019/12/30 19:06:05.722014 [DEBUG] http: Request POST /v1/connect/intentions (466.210776ms) from=127.0.0.1:46982
TestCommand - 2019/12/30 19:06:05.739144 [DEBUG] http: Request GET /v1/connect/intentions (1.154365ms) from=127.0.0.1:46984
TestCommand - 2019/12/30 19:06:05.742244 [INFO] agent: Requesting shutdown
TestCommand - 2019/12/30 19:06:05.742354 [INFO] consul: shutting down server
TestCommand - 2019/12/30 19:06:05.742409 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/30 19:06:05.822586 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/30 19:06:05.948853 [INFO] manager: shutting down
TestCommand - 2019/12/30 19:06:05.997786 [INFO] agent: consul server down
TestCommand - 2019/12/30 19:06:05.997871 [INFO] agent: shutdown complete
TestCommand - 2019/12/30 19:06:05.997937 [INFO] agent: Stopping DNS server 127.0.0.1:49031 (tcp)
TestCommand - 2019/12/30 19:06:05.998073 [INFO] agent: Stopping DNS server 127.0.0.1:49031 (udp)
TestCommand - 2019/12/30 19:06:05.998222 [INFO] agent: Stopping HTTP server 127.0.0.1:49032 (tcp)
TestCommand - 2019/12/30 19:06:05.998881 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/12/30 19:06:05.999002 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestCommand - 2019/12/30 19:06:05.999140 [INFO] agent: Endpoints down
--- PASS: TestCommand (3.27s)
PASS
ok  	github.com/hashicorp/consul/command/intention/create	7.072s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== CONT  TestCommand_noTabs
=== CONT  TestCommand
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
--- PASS: TestCommand_noTabs (0.02s)
--- PASS: TestCommand_Validation (0.02s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/12/30 19:06:32.936374 [WARN] agent: Node name "Node df4d9557-20f1-b1ab-c7bc-424a1d37a8d9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/12/30 19:06:32.952618 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/12/30 19:06:32.962940 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:06:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:df4d9557-20f1-b1ab-c7bc-424a1d37a8d9 Address:127.0.0.1:43006}]
2019/12/30 19:06:33 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/12/30 19:06:33.720733 [INFO] serf: EventMemberJoin: Node df4d9557-20f1-b1ab-c7bc-424a1d37a8d9.dc1 127.0.0.1
TestCommand - 2019/12/30 19:06:33.725793 [INFO] serf: EventMemberJoin: Node df4d9557-20f1-b1ab-c7bc-424a1d37a8d9 127.0.0.1
TestCommand - 2019/12/30 19:06:33.727039 [INFO] consul: Adding LAN server Node df4d9557-20f1-b1ab-c7bc-424a1d37a8d9 (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestCommand - 2019/12/30 19:06:33.730122 [INFO] consul: Handled member-join event for server "Node df4d9557-20f1-b1ab-c7bc-424a1d37a8d9.dc1" in area "wan"
TestCommand - 2019/12/30 19:06:33.730902 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestCommand - 2019/12/30 19:06:33.731404 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestCommand - 2019/12/30 19:06:33.734631 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestCommand - 2019/12/30 19:06:33.734914 [INFO] agent: started state syncer
2019/12/30 19:06:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:33 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:34 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestCommand - 2019/12/30 19:06:34.265877 [INFO] consul: cluster leadership acquired
TestCommand - 2019/12/30 19:06:34.266645 [INFO] consul: New leader elected: Node df4d9557-20f1-b1ab-c7bc-424a1d37a8d9
TestCommand - 2019/12/30 19:06:34.809310 [INFO] agent: Synced node info
TestCommand - 2019/12/30 19:06:34.829575 [DEBUG] http: Request POST /v1/connect/intentions (467.986796ms) from=127.0.0.1:35374
TestCommand - 2019/12/30 19:06:34.843876 [DEBUG] http: Request GET /v1/connect/intentions (2.246728ms) from=127.0.0.1:35376
TestCommand - 2019/12/30 19:06:35.776130 [DEBUG] http: Request DELETE /v1/connect/intentions/60ae8810-63f1-ba04-0bb5-0e2086f20827 (918.224438ms) from=127.0.0.1:35376
TestCommand - 2019/12/30 19:06:35.780430 [DEBUG] http: Request GET /v1/connect/intentions (1.270035ms) from=127.0.0.1:35374
TestCommand - 2019/12/30 19:06:35.782221 [INFO] agent: Requesting shutdown
TestCommand - 2019/12/30 19:06:35.782316 [INFO] consul: shutting down server
TestCommand - 2019/12/30 19:06:35.782364 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/30 19:06:35.932177 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/30 19:06:36.023496 [INFO] manager: shutting down
TestCommand - 2019/12/30 19:06:36.024355 [INFO] agent: consul server down
TestCommand - 2019/12/30 19:06:36.024671 [INFO] agent: shutdown complete
TestCommand - 2019/12/30 19:06:36.024845 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestCommand - 2019/12/30 19:06:36.025101 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestCommand - 2019/12/30 19:06:36.025345 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestCommand - 2019/12/30 19:06:36.025944 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/12/30 19:06:36.026151 [INFO] agent: Endpoints down
--- PASS: TestCommand (3.18s)
PASS
ok  	github.com/hashicorp/consul/command/intention/delete	3.514s
=== RUN   TestFinder
=== PAUSE TestFinder
=== CONT  TestFinder
WARNING: bootstrap = true: do not enable unless necessary
TestFinder - 2019/12/30 19:06:33.976718 [WARN] agent: Node name "Node f57738e2-10e3-9076-5543-f91363e48186" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestFinder - 2019/12/30 19:06:33.978270 [DEBUG] tlsutil: Update with version 1
TestFinder - 2019/12/30 19:06:33.985540 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:06:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f57738e2-10e3-9076-5543-f91363e48186 Address:127.0.0.1:47506}]
2019/12/30 19:06:35 [INFO]  raft: Node at 127.0.0.1:47506 [Follower] entering Follower state (Leader: "")
TestFinder - 2019/12/30 19:06:35.508456 [INFO] serf: EventMemberJoin: Node f57738e2-10e3-9076-5543-f91363e48186.dc1 127.0.0.1
TestFinder - 2019/12/30 19:06:35.522328 [INFO] serf: EventMemberJoin: Node f57738e2-10e3-9076-5543-f91363e48186 127.0.0.1
TestFinder - 2019/12/30 19:06:35.524006 [INFO] consul: Adding LAN server Node f57738e2-10e3-9076-5543-f91363e48186 (Addr: tcp/127.0.0.1:47506) (DC: dc1)
TestFinder - 2019/12/30 19:06:35.524725 [INFO] consul: Handled member-join event for server "Node f57738e2-10e3-9076-5543-f91363e48186.dc1" in area "wan"
TestFinder - 2019/12/30 19:06:35.527785 [INFO] agent: Started DNS server 127.0.0.1:47501 (udp)
TestFinder - 2019/12/30 19:06:35.529026 [INFO] agent: Started DNS server 127.0.0.1:47501 (tcp)
TestFinder - 2019/12/30 19:06:35.533152 [INFO] agent: Started HTTP server on 127.0.0.1:47502 (tcp)
TestFinder - 2019/12/30 19:06:35.533453 [INFO] agent: started state syncer
2019/12/30 19:06:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:35 [INFO]  raft: Node at 127.0.0.1:47506 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:36 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:36 [INFO]  raft: Node at 127.0.0.1:47506 [Leader] entering Leader state
TestFinder - 2019/12/30 19:06:36.623896 [INFO] consul: cluster leadership acquired
TestFinder - 2019/12/30 19:06:36.624445 [INFO] consul: New leader elected: Node f57738e2-10e3-9076-5543-f91363e48186
TestFinder - 2019/12/30 19:06:36.949259 [INFO] agent: Synced node info
TestFinder - 2019/12/30 19:06:37.259785 [DEBUG] http: Request POST /v1/connect/intentions (242.92864ms) from=127.0.0.1:40890
TestFinder - 2019/12/30 19:06:37.268649 [DEBUG] http: Request GET /v1/connect/intentions (2.354731ms) from=127.0.0.1:40890
TestFinder - 2019/12/30 19:06:37.271213 [INFO] agent: Requesting shutdown
TestFinder - 2019/12/30 19:06:37.271337 [INFO] consul: shutting down server
TestFinder - 2019/12/30 19:06:37.271466 [WARN] serf: Shutdown without a Leave
TestFinder - 2019/12/30 19:06:37.399888 [WARN] serf: Shutdown without a Leave
TestFinder - 2019/12/30 19:06:37.531750 [INFO] manager: shutting down
TestFinder - 2019/12/30 19:06:37.582831 [INFO] agent: consul server down
TestFinder - 2019/12/30 19:06:37.582910 [INFO] agent: shutdown complete
TestFinder - 2019/12/30 19:06:37.582970 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (tcp)
TestFinder - 2019/12/30 19:06:37.583123 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (udp)
TestFinder - 2019/12/30 19:06:37.583270 [INFO] agent: Stopping HTTP server 127.0.0.1:47502 (tcp)
TestFinder - 2019/12/30 19:06:37.583704 [INFO] agent: Waiting for endpoints to shut down
TestFinder - 2019/12/30 19:06:37.583792 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestFinder - 2019/12/30 19:06:37.583954 [INFO] agent: Endpoints down
--- PASS: TestFinder (3.69s)
PASS
ok  	github.com/hashicorp/consul/command/intention/finder	3.973s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_id
=== PAUSE TestCommand_id
=== RUN   TestCommand_srcDst
=== PAUSE TestCommand_srcDst
=== CONT  TestCommand_noTabs
--- PASS: TestCommand_noTabs (0.00s)
=== CONT  TestCommand_srcDst
=== CONT  TestCommand_id
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_id - 2019/12/30 19:06:48.748313 [WARN] agent: Node name "Node 8af0b9e3-9453-a91a-20c7-1e835feee1d7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_id - 2019/12/30 19:06:48.750217 [DEBUG] tlsutil: Update with version 1
TestCommand_srcDst - 2019/12/30 19:06:48.749291 [WARN] agent: Node name "Node b5539549-0fa3-b494-b4ea-0f972e561ade" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_srcDst - 2019/12/30 19:06:48.751718 [DEBUG] tlsutil: Update with version 1
TestCommand_srcDst - 2019/12/30 19:06:48.758139 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_id - 2019/12/30 19:06:48.764757 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:06:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8af0b9e3-9453-a91a-20c7-1e835feee1d7 Address:127.0.0.1:46012}]
2019/12/30 19:06:49 [INFO]  raft: Node at 127.0.0.1:46012 [Follower] entering Follower state (Leader: "")
TestCommand_id - 2019/12/30 19:06:49.704228 [INFO] serf: EventMemberJoin: Node 8af0b9e3-9453-a91a-20c7-1e835feee1d7.dc1 127.0.0.1
TestCommand_id - 2019/12/30 19:06:49.708310 [INFO] serf: EventMemberJoin: Node 8af0b9e3-9453-a91a-20c7-1e835feee1d7 127.0.0.1
TestCommand_id - 2019/12/30 19:06:49.711070 [INFO] agent: Started DNS server 127.0.0.1:46007 (udp)
TestCommand_id - 2019/12/30 19:06:49.711531 [INFO] consul: Adding LAN server Node 8af0b9e3-9453-a91a-20c7-1e835feee1d7 (Addr: tcp/127.0.0.1:46012) (DC: dc1)
TestCommand_id - 2019/12/30 19:06:49.711765 [INFO] consul: Handled member-join event for server "Node 8af0b9e3-9453-a91a-20c7-1e835feee1d7.dc1" in area "wan"
TestCommand_id - 2019/12/30 19:06:49.712352 [INFO] agent: Started DNS server 127.0.0.1:46007 (tcp)
TestCommand_id - 2019/12/30 19:06:49.715139 [INFO] agent: Started HTTP server on 127.0.0.1:46008 (tcp)
TestCommand_id - 2019/12/30 19:06:49.715282 [INFO] agent: started state syncer
2019/12/30 19:06:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:49 [INFO]  raft: Node at 127.0.0.1:46012 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b5539549-0fa3-b494-b4ea-0f972e561ade Address:127.0.0.1:46006}]
2019/12/30 19:06:49 [INFO]  raft: Node at 127.0.0.1:46006 [Follower] entering Follower state (Leader: "")
TestCommand_srcDst - 2019/12/30 19:06:49.771634 [INFO] serf: EventMemberJoin: Node b5539549-0fa3-b494-b4ea-0f972e561ade.dc1 127.0.0.1
TestCommand_srcDst - 2019/12/30 19:06:49.779033 [INFO] serf: EventMemberJoin: Node b5539549-0fa3-b494-b4ea-0f972e561ade 127.0.0.1
TestCommand_srcDst - 2019/12/30 19:06:49.780938 [INFO] agent: Started DNS server 127.0.0.1:46001 (udp)
TestCommand_srcDst - 2019/12/30 19:06:49.783177 [INFO] consul: Adding LAN server Node b5539549-0fa3-b494-b4ea-0f972e561ade (Addr: tcp/127.0.0.1:46006) (DC: dc1)
TestCommand_srcDst - 2019/12/30 19:06:49.784204 [INFO] consul: Handled member-join event for server "Node b5539549-0fa3-b494-b4ea-0f972e561ade.dc1" in area "wan"
TestCommand_srcDst - 2019/12/30 19:06:49.785615 [INFO] agent: Started DNS server 127.0.0.1:46001 (tcp)
TestCommand_srcDst - 2019/12/30 19:06:49.791344 [INFO] agent: Started HTTP server on 127.0.0.1:46002 (tcp)
TestCommand_srcDst - 2019/12/30 19:06:49.791607 [INFO] agent: started state syncer
2019/12/30 19:06:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:49 [INFO]  raft: Node at 127.0.0.1:46006 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:50 [INFO]  raft: Node at 127.0.0.1:46012 [Leader] entering Leader state
TestCommand_id - 2019/12/30 19:06:50.301325 [INFO] consul: cluster leadership acquired
TestCommand_id - 2019/12/30 19:06:50.301904 [INFO] consul: New leader elected: Node 8af0b9e3-9453-a91a-20c7-1e835feee1d7
2019/12/30 19:06:50 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:50 [INFO]  raft: Node at 127.0.0.1:46006 [Leader] entering Leader state
TestCommand_srcDst - 2019/12/30 19:06:50.549192 [INFO] consul: cluster leadership acquired
TestCommand_srcDst - 2019/12/30 19:06:50.549828 [INFO] consul: New leader elected: Node b5539549-0fa3-b494-b4ea-0f972e561ade
TestCommand_id - 2019/12/30 19:06:50.900390 [INFO] agent: Synced node info
TestCommand_id - 2019/12/30 19:06:50.903485 [DEBUG] agent: Node info in sync
TestCommand_id - 2019/12/30 19:06:50.903637 [DEBUG] http: Request POST /v1/connect/intentions (579.120149ms) from=127.0.0.1:60158
TestCommand_id - 2019/12/30 19:06:50.932455 [DEBUG] http: Request GET /v1/connect/intentions/b7138368-455e-d39b-5e75-add74eea1853 (2.032722ms) from=127.0.0.1:60162
TestCommand_id - 2019/12/30 19:06:50.936165 [INFO] agent: Requesting shutdown
TestCommand_id - 2019/12/30 19:06:50.936444 [INFO] consul: shutting down server
TestCommand_id - 2019/12/30 19:06:50.936721 [WARN] serf: Shutdown without a Leave
TestCommand_id - 2019/12/30 19:06:51.065311 [WARN] serf: Shutdown without a Leave
TestCommand_srcDst - 2019/12/30 19:06:51.093985 [DEBUG] http: Request POST /v1/connect/intentions (480.940134ms) from=127.0.0.1:51166
TestCommand_srcDst - 2019/12/30 19:06:51.095327 [INFO] agent: Synced node info
TestCommand_srcDst - 2019/12/30 19:06:51.095457 [DEBUG] agent: Node info in sync
TestCommand_srcDst - 2019/12/30 19:06:51.137425 [DEBUG] http: Request GET /v1/connect/intentions (27.708423ms) from=127.0.0.1:51170
TestCommand_srcDst - 2019/12/30 19:06:51.144855 [DEBUG] http: Request GET /v1/connect/intentions/4ebf8d9b-d9ca-ad49-a30d-e97e41ae229f (1.476707ms) from=127.0.0.1:51170
TestCommand_srcDst - 2019/12/30 19:06:51.147451 [INFO] agent: Requesting shutdown
TestCommand_srcDst - 2019/12/30 19:06:51.147725 [INFO] consul: shutting down server
TestCommand_srcDst - 2019/12/30 19:06:51.147867 [WARN] serf: Shutdown without a Leave
TestCommand_id - 2019/12/30 19:06:51.173723 [INFO] manager: shutting down
TestCommand_srcDst - 2019/12/30 19:06:51.278722 [WARN] serf: Shutdown without a Leave
TestCommand_id - 2019/12/30 19:06:51.440585 [INFO] agent: consul server down
TestCommand_id - 2019/12/30 19:06:51.440792 [INFO] agent: shutdown complete
TestCommand_id - 2019/12/30 19:06:51.440986 [INFO] agent: Stopping DNS server 127.0.0.1:46007 (tcp)
TestCommand_id - 2019/12/30 19:06:51.441202 [INFO] agent: Stopping DNS server 127.0.0.1:46007 (udp)
TestCommand_id - 2019/12/30 19:06:51.441908 [INFO] agent: Stopping HTTP server 127.0.0.1:46008 (tcp)
TestCommand_id - 2019/12/30 19:06:51.440869 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_id - 2019/12/30 19:06:51.442582 [INFO] agent: Waiting for endpoints to shut down
TestCommand_id - 2019/12/30 19:06:51.442668 [INFO] agent: Endpoints down
--- PASS: TestCommand_id (2.86s)
TestCommand_srcDst - 2019/12/30 19:06:51.446474 [INFO] manager: shutting down
TestCommand_srcDst - 2019/12/30 19:06:51.557814 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_srcDst - 2019/12/30 19:06:51.558091 [INFO] agent: consul server down
TestCommand_srcDst - 2019/12/30 19:06:51.560142 [INFO] agent: shutdown complete
TestCommand_srcDst - 2019/12/30 19:06:51.560195 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCommand_srcDst - 2019/12/30 19:06:51.560207 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (tcp)
TestCommand_srcDst - 2019/12/30 19:06:51.560407 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (udp)
TestCommand_srcDst - 2019/12/30 19:06:51.560552 [INFO] agent: Stopping HTTP server 127.0.0.1:46002 (tcp)
TestCommand_srcDst - 2019/12/30 19:06:51.561150 [INFO] agent: Waiting for endpoints to shut down
TestCommand_srcDst - 2019/12/30 19:06:51.561266 [INFO] agent: Endpoints down
--- PASS: TestCommand_srcDst (2.98s)
PASS
ok  	github.com/hashicorp/consul/command/intention/get	3.452s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_matchDst
=== PAUSE TestCommand_matchDst
=== RUN   TestCommand_matchSource
=== PAUSE TestCommand_matchSource
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_matchSource
--- PASS: TestCommand_noTabs (0.00s)
=== CONT  TestCommand_matchDst
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
=== RUN   TestCommand_Validation/both_source_and_dest
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
    --- PASS: TestCommand_Validation/both_source_and_dest (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_matchDst - 2019/12/30 19:06:52.382144 [WARN] agent: Node name "Node e1fa6fdd-c8f8-25c6-4ec3-fe45756e5986" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_matchDst - 2019/12/30 19:06:52.383145 [DEBUG] tlsutil: Update with version 1
TestCommand_matchDst - 2019/12/30 19:06:52.395026 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_matchSource - 2019/12/30 19:06:52.412628 [WARN] agent: Node name "Node 9eed2aad-0d31-5db8-8308-322a39993c76" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_matchSource - 2019/12/30 19:06:52.413058 [DEBUG] tlsutil: Update with version 1
TestCommand_matchSource - 2019/12/30 19:06:52.415390 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:06:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e1fa6fdd-c8f8-25c6-4ec3-fe45756e5986 Address:127.0.0.1:41512}]
2019/12/30 19:06:53 [INFO]  raft: Node at 127.0.0.1:41512 [Follower] entering Follower state (Leader: "")
TestCommand_matchDst - 2019/12/30 19:06:53.853675 [INFO] serf: EventMemberJoin: Node e1fa6fdd-c8f8-25c6-4ec3-fe45756e5986.dc1 127.0.0.1
TestCommand_matchDst - 2019/12/30 19:06:53.857384 [INFO] serf: EventMemberJoin: Node e1fa6fdd-c8f8-25c6-4ec3-fe45756e5986 127.0.0.1
TestCommand_matchDst - 2019/12/30 19:06:53.858240 [INFO] consul: Handled member-join event for server "Node e1fa6fdd-c8f8-25c6-4ec3-fe45756e5986.dc1" in area "wan"
TestCommand_matchDst - 2019/12/30 19:06:53.858441 [INFO] consul: Adding LAN server Node e1fa6fdd-c8f8-25c6-4ec3-fe45756e5986 (Addr: tcp/127.0.0.1:41512) (DC: dc1)
TestCommand_matchDst - 2019/12/30 19:06:53.859025 [INFO] agent: Started DNS server 127.0.0.1:41507 (tcp)
TestCommand_matchDst - 2019/12/30 19:06:53.859610 [INFO] agent: Started DNS server 127.0.0.1:41507 (udp)
TestCommand_matchDst - 2019/12/30 19:06:53.862506 [INFO] agent: Started HTTP server on 127.0.0.1:41508 (tcp)
TestCommand_matchDst - 2019/12/30 19:06:53.862689 [INFO] agent: started state syncer
2019/12/30 19:06:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9eed2aad-0d31-5db8-8308-322a39993c76 Address:127.0.0.1:41506}]
2019/12/30 19:06:53 [INFO]  raft: Node at 127.0.0.1:41512 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:53 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
TestCommand_matchSource - 2019/12/30 19:06:53.903348 [INFO] serf: EventMemberJoin: Node 9eed2aad-0d31-5db8-8308-322a39993c76.dc1 127.0.0.1
TestCommand_matchSource - 2019/12/30 19:06:53.906700 [INFO] serf: EventMemberJoin: Node 9eed2aad-0d31-5db8-8308-322a39993c76 127.0.0.1
TestCommand_matchSource - 2019/12/30 19:06:53.907554 [INFO] consul: Adding LAN server Node 9eed2aad-0d31-5db8-8308-322a39993c76 (Addr: tcp/127.0.0.1:41506) (DC: dc1)
TestCommand_matchSource - 2019/12/30 19:06:53.907640 [INFO] consul: Handled member-join event for server "Node 9eed2aad-0d31-5db8-8308-322a39993c76.dc1" in area "wan"
TestCommand_matchSource - 2019/12/30 19:06:53.908136 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestCommand_matchSource - 2019/12/30 19:06:53.908209 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestCommand_matchSource - 2019/12/30 19:06:53.910704 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestCommand_matchSource - 2019/12/30 19:06:53.910830 [INFO] agent: started state syncer
2019/12/30 19:06:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:06:53 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
2019/12/30 19:06:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:54 [INFO]  raft: Node at 127.0.0.1:41512 [Leader] entering Leader state
TestCommand_matchDst - 2019/12/30 19:06:54.349137 [INFO] consul: cluster leadership acquired
TestCommand_matchDst - 2019/12/30 19:06:54.349830 [INFO] consul: New leader elected: Node e1fa6fdd-c8f8-25c6-4ec3-fe45756e5986
2019/12/30 19:06:54 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:06:54 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
TestCommand_matchSource - 2019/12/30 19:06:54.435869 [INFO] consul: cluster leadership acquired
TestCommand_matchSource - 2019/12/30 19:06:54.438680 [INFO] consul: New leader elected: Node 9eed2aad-0d31-5db8-8308-322a39993c76
TestCommand_matchSource - 2019/12/30 19:06:55.016741 [INFO] agent: Synced node info
TestCommand_matchSource - 2019/12/30 19:06:55.018581 [DEBUG] http: Request POST /v1/connect/intentions (406.343093ms) from=127.0.0.1:39848
TestCommand_matchDst - 2019/12/30 19:06:55.092259 [INFO] agent: Synced node info
TestCommand_matchDst - 2019/12/30 19:06:55.093301 [DEBUG] http: Request POST /v1/connect/intentions (629.357183ms) from=127.0.0.1:33658
TestCommand_matchSource - 2019/12/30 19:06:55.767863 [DEBUG] http: Request POST /v1/connect/intentions (745.489352ms) from=127.0.0.1:39848
TestCommand_matchDst - 2019/12/30 19:06:55.852255 [DEBUG] http: Request POST /v1/connect/intentions (755.413956ms) from=127.0.0.1:33658
TestCommand_matchSource - 2019/12/30 19:06:56.226749 [DEBUG] http: Request POST /v1/connect/intentions (436.422247ms) from=127.0.0.1:39848
TestCommand_matchSource - 2019/12/30 19:06:56.272277 [DEBUG] http: Request GET /v1/connect/intentions/match?by=source&name=foo (1.784049ms) from=127.0.0.1:39850
TestCommand_matchSource - 2019/12/30 19:06:56.274819 [INFO] agent: Requesting shutdown
TestCommand_matchSource - 2019/12/30 19:06:56.274915 [INFO] consul: shutting down server
TestCommand_matchSource - 2019/12/30 19:06:56.274971 [WARN] serf: Shutdown without a Leave
TestCommand_matchSource - 2019/12/30 19:06:56.381997 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/12/30 19:06:56.383960 [DEBUG] http: Request POST /v1/connect/intentions (528.116083ms) from=127.0.0.1:33658
TestCommand_matchDst - 2019/12/30 19:06:56.390748 [DEBUG] http: Request GET /v1/connect/intentions/match?by=destination&name=db (1.375705ms) from=127.0.0.1:33664
TestCommand_matchDst - 2019/12/30 19:06:56.394552 [INFO] agent: Requesting shutdown
TestCommand_matchDst - 2019/12/30 19:06:56.396863 [INFO] consul: shutting down server
TestCommand_matchDst - 2019/12/30 19:06:56.397096 [WARN] serf: Shutdown without a Leave
TestCommand_matchSource - 2019/12/30 19:06:56.457097 [INFO] manager: shutting down
TestCommand_matchSource - 2019/12/30 19:06:56.457488 [INFO] agent: consul server down
TestCommand_matchSource - 2019/12/30 19:06:56.457556 [INFO] agent: shutdown complete
TestCommand_matchSource - 2019/12/30 19:06:56.457614 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestCommand_matchSource - 2019/12/30 19:06:56.457814 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestCommand_matchSource - 2019/12/30 19:06:56.458014 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestCommand_matchSource - 2019/12/30 19:06:56.458728 [INFO] agent: Waiting for endpoints to shut down
TestCommand_matchSource - 2019/12/30 19:06:56.458927 [INFO] agent: Endpoints down
--- PASS: TestCommand_matchSource (4.20s)
TestCommand_matchDst - 2019/12/30 19:06:56.531976 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/12/30 19:06:56.607102 [INFO] manager: shutting down
TestCommand_matchDst - 2019/12/30 19:06:56.732282 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestCommand_matchDst - 2019/12/30 19:06:56.732565 [INFO] agent: consul server down
TestCommand_matchDst - 2019/12/30 19:06:56.732662 [INFO] agent: shutdown complete
TestCommand_matchDst - 2019/12/30 19:06:56.732724 [INFO] agent: Stopping DNS server 127.0.0.1:41507 (tcp)
TestCommand_matchDst - 2019/12/30 19:06:56.732942 [INFO] agent: Stopping DNS server 127.0.0.1:41507 (udp)
TestCommand_matchDst - 2019/12/30 19:06:56.733258 [INFO] agent: Stopping HTTP server 127.0.0.1:41508 (tcp)
TestCommand_matchDst - 2019/12/30 19:06:56.734464 [INFO] agent: Waiting for endpoints to shut down
TestCommand_matchDst - 2019/12/30 19:06:56.734617 [INFO] agent: Endpoints down
--- PASS: TestCommand_matchDst (4.48s)
PASS
ok  	github.com/hashicorp/consul/command/intention/match	4.762s
=== RUN   TestJoinCommand_noTabs
=== PAUSE TestJoinCommand_noTabs
=== RUN   TestJoinCommandJoin_lan
=== PAUSE TestJoinCommandJoin_lan
=== RUN   TestJoinCommand_wan
=== PAUSE TestJoinCommand_wan
=== RUN   TestJoinCommand_noAddrs
=== PAUSE TestJoinCommand_noAddrs
=== CONT  TestJoinCommand_noTabs
--- PASS: TestJoinCommand_noTabs (0.00s)
=== CONT  TestJoinCommand_noAddrs
--- PASS: TestJoinCommand_noAddrs (0.00s)
=== CONT  TestJoinCommand_wan
=== CONT  TestJoinCommandJoin_lan
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommand_wan - 2019/12/30 19:07:23.583336 [WARN] agent: Node name "Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommand_wan - 2019/12/30 19:07:23.584226 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommandJoin_lan - 2019/12/30 19:07:23.601084 [WARN] agent: Node name "Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommandJoin_lan - 2019/12/30 19:07:23.601719 [DEBUG] tlsutil: Update with version 1
TestJoinCommand_wan - 2019/12/30 19:07:23.604807 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/12/30 19:07:23.605493 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:07:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87 Address:127.0.0.1:25006}]
2019/12/30 19:07:25 [INFO]  raft: Node at 127.0.0.1:25006 [Follower] entering Follower state (Leader: "")
TestJoinCommand_wan - 2019/12/30 19:07:25.113884 [INFO] serf: EventMemberJoin: Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/30 19:07:25.117644 [INFO] serf: EventMemberJoin: Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87 127.0.0.1
TestJoinCommand_wan - 2019/12/30 19:07:25.118756 [INFO] consul: Handled member-join event for server "Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/30 19:07:25.119111 [INFO] consul: Adding LAN server Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87 (Addr: tcp/127.0.0.1:25006) (DC: dc1)
TestJoinCommand_wan - 2019/12/30 19:07:25.119841 [INFO] agent: Started DNS server 127.0.0.1:25001 (tcp)
TestJoinCommand_wan - 2019/12/30 19:07:25.119936 [INFO] agent: Started DNS server 127.0.0.1:25001 (udp)
TestJoinCommand_wan - 2019/12/30 19:07:25.123361 [INFO] agent: Started HTTP server on 127.0.0.1:25002 (tcp)
TestJoinCommand_wan - 2019/12/30 19:07:25.123511 [INFO] agent: started state syncer
2019/12/30 19:07:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:25 [INFO]  raft: Node at 127.0.0.1:25006 [Candidate] entering Candidate state in term 2
2019/12/30 19:07:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:05fdee6d-0e80-f0b0-e1fb-c28f9133feea Address:127.0.0.1:25012}]
2019/12/30 19:07:25 [INFO]  raft: Node at 127.0.0.1:25012 [Follower] entering Follower state (Leader: "")
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.187539 [INFO] serf: EventMemberJoin: Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.191223 [INFO] serf: EventMemberJoin: Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.192114 [INFO] consul: Handled member-join event for server "Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.192498 [INFO] consul: Adding LAN server Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea (Addr: tcp/127.0.0.1:25012) (DC: dc1)
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.192766 [INFO] agent: Started DNS server 127.0.0.1:25007 (udp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.193129 [INFO] agent: Started DNS server 127.0.0.1:25007 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.195788 [INFO] agent: Started HTTP server on 127.0.0.1:25008 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:25.195926 [INFO] agent: started state syncer
2019/12/30 19:07:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:25 [INFO]  raft: Node at 127.0.0.1:25012 [Candidate] entering Candidate state in term 2
2019/12/30 19:07:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:26 [INFO]  raft: Node at 127.0.0.1:25006 [Leader] entering Leader state
TestJoinCommand_wan - 2019/12/30 19:07:26.168973 [INFO] consul: cluster leadership acquired
TestJoinCommand_wan - 2019/12/30 19:07:26.169992 [INFO] consul: New leader elected: Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87
2019/12/30 19:07:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:26 [INFO]  raft: Node at 127.0.0.1:25012 [Leader] entering Leader state
TestJoinCommandJoin_lan - 2019/12/30 19:07:26.383746 [INFO] consul: cluster leadership acquired
TestJoinCommandJoin_lan - 2019/12/30 19:07:26.384322 [INFO] consul: New leader elected: Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea
TestJoinCommand_wan - 2019/12/30 19:07:26.550641 [INFO] agent: Synced node info
TestJoinCommand_wan - 2019/12/30 19:07:26.550796 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommand_wan - 2019/12/30 19:07:26.609257 [WARN] agent: Node name "Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommand_wan - 2019/12/30 19:07:26.610207 [DEBUG] tlsutil: Update with version 1
TestJoinCommand_wan - 2019/12/30 19:07:26.613177 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommandJoin_lan - 2019/12/30 19:07:26.694480 [WARN] agent: Node name "Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommandJoin_lan - 2019/12/30 19:07:26.694928 [DEBUG] tlsutil: Update with version 1
TestJoinCommandJoin_lan - 2019/12/30 19:07:26.697465 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/12/30 19:07:26.892472 [INFO] agent: Synced node info
TestJoinCommand_wan - 2019/12/30 19:07:27.428356 [DEBUG] agent: Node info in sync
TestJoinCommand_wan - 2019/12/30 19:07:28.477620 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestJoinCommand_wan - 2019/12/30 19:07:28.478302 [DEBUG] consul: Skipping self join check for "Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87" since the cluster is too small
TestJoinCommand_wan - 2019/12/30 19:07:28.478505 [INFO] consul: member 'Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87' joined, marking health alive
2019/12/30 19:07:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fc51c4f5-1474-6cd4-7a68-4f88d0c471b6 Address:127.0.0.1:25018}]
2019/12/30 19:07:28 [INFO]  raft: Node at 127.0.0.1:25018 [Follower] entering Follower state (Leader: "")
2019/12/30 19:07:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3b0b82a1-0b3c-33f1-cec7-56ebe8140111 Address:127.0.0.1:25024}]
TestJoinCommand_wan - 2019/12/30 19:07:28.509622 [INFO] serf: EventMemberJoin: Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6.dc1 127.0.0.1
2019/12/30 19:07:28 [INFO]  raft: Node at 127.0.0.1:25024 [Follower] entering Follower state (Leader: "")
TestJoinCommand_wan - 2019/12/30 19:07:28.541335 [INFO] serf: EventMemberJoin: Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6 127.0.0.1
TestJoinCommand_wan - 2019/12/30 19:07:28.542139 [INFO] consul: Adding LAN server Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6 (Addr: tcp/127.0.0.1:25018) (DC: dc1)
TestJoinCommand_wan - 2019/12/30 19:07:28.542522 [INFO] consul: Handled member-join event for server "Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/30 19:07:28.542909 [INFO] agent: Started DNS server 127.0.0.1:25013 (udp)
TestJoinCommand_wan - 2019/12/30 19:07:28.542977 [INFO] agent: Started DNS server 127.0.0.1:25013 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.544176 [INFO] serf: EventMemberJoin: Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111.dc1 127.0.0.1
2019/12/30 19:07:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:28 [INFO]  raft: Node at 127.0.0.1:25018 [Candidate] entering Candidate state in term 2
TestJoinCommand_wan - 2019/12/30 19:07:28.545609 [INFO] agent: Started HTTP server on 127.0.0.1:25014 (tcp)
TestJoinCommand_wan - 2019/12/30 19:07:28.545731 [INFO] agent: started state syncer
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.551779 [INFO] serf: EventMemberJoin: Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.552568 [INFO] consul: Adding LAN server Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111 (Addr: tcp/127.0.0.1:25024) (DC: dc1)
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.553240 [INFO] consul: Handled member-join event for server "Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.554532 [INFO] agent: Started DNS server 127.0.0.1:25019 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.554618 [INFO] agent: Started DNS server 127.0.0.1:25019 (udp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.557119 [INFO] agent: Started HTTP server on 127.0.0.1:25020 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:28.557270 [INFO] agent: started state syncer
2019/12/30 19:07:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:28 [INFO]  raft: Node at 127.0.0.1:25024 [Candidate] entering Candidate state in term 2
TestJoinCommand_wan - 2019/12/30 19:07:29.016379 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.140685 [DEBUG] agent: Node info in sync
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.140816 [DEBUG] agent: Node info in sync
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.235765 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.237007 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.240375 [DEBUG] consul: Skipping self join check for "Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea" since the cluster is too small
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.240571 [INFO] consul: member 'Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea' joined, marking health alive
2019/12/30 19:07:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:29 [INFO]  raft: Node at 127.0.0.1:25018 [Leader] entering Leader state
TestJoinCommand_wan - 2019/12/30 19:07:29.385531 [INFO] consul: cluster leadership acquired
TestJoinCommand_wan - 2019/12/30 19:07:29.386027 [INFO] consul: New leader elected: Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6
2019/12/30 19:07:29 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:29 [INFO]  raft: Node at 127.0.0.1:25024 [Leader] entering Leader state
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.450276 [INFO] consul: cluster leadership acquired
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.450783 [INFO] consul: New leader elected: Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111
TestJoinCommand_wan - 2019/12/30 19:07:29.655309 [INFO] agent: (WAN) joining: [127.0.0.1:25017]
TestJoinCommand_wan - 2019/12/30 19:07:29.656307 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:25017
TestJoinCommand_wan - 2019/12/30 19:07:29.656393 [DEBUG] memberlist: Stream connection from=127.0.0.1:57888
TestJoinCommand_wan - 2019/12/30 19:07:29.660683 [INFO] serf: EventMemberJoin: Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/30 19:07:29.661539 [INFO] consul: Handled member-join event for server "Node f6dc6ad2-dbd5-25a9-d5de-496bd47c6f87.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/30 19:07:29.661806 [INFO] serf: EventMemberJoin: Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/30 19:07:29.662313 [INFO] consul: Handled member-join event for server "Node fc51c4f5-1474-6cd4-7a68-4f88d0c471b6.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/30 19:07:29.663101 [INFO] agent: (WAN) joined: 1
TestJoinCommand_wan - 2019/12/30 19:07:29.663322 [DEBUG] http: Request PUT /v1/agent/join/127.0.0.1:25017?wan=1 (8.018885ms) from=127.0.0.1:56148
TestJoinCommand_wan - 2019/12/30 19:07:29.664058 [INFO] agent: Requesting shutdown
TestJoinCommand_wan - 2019/12/30 19:07:29.664133 [INFO] consul: shutting down server
TestJoinCommand_wan - 2019/12/30 19:07:29.664185 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.759046 [INFO] agent: (LAN) joining: [127.0.0.1:25022]
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.760200 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:25022
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.761744 [DEBUG] memberlist: Stream connection from=127.0.0.1:59310
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.765952 [INFO] serf: EventMemberJoin: Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.766631 [INFO] agent: (LAN) joined: 1
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.767214 [DEBUG] agent: systemd notify failed: No socket
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.767423 [DEBUG] http: Request PUT /v1/agent/join/127.0.0.1:25022 (8.393562ms) from=127.0.0.1:41542
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.767090 [INFO] consul: Adding LAN server Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111 (Addr: tcp/127.0.0.1:25024) (DC: dc1)
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.768028 [ERR] consul: 'Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111' and 'Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.768174 [INFO] consul: member 'Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111' joined, marking health alive
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.768612 [INFO] agent: Requesting shutdown
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.769320 [INFO] consul: shutting down server
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.769610 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.768943 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:25023
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.769192 [DEBUG] memberlist: Stream connection from=127.0.0.1:59920
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.782140 [INFO] serf: EventMemberJoin: Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.793429 [INFO] consul: Handled member-join event for server "Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.801594 [INFO] serf: EventMemberJoin: Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.802175 [DEBUG] consul: Successfully performed flood-join for "Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111" at 127.0.0.1:25023
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.803137 [INFO] consul: Handled member-join event for server "Node 3b0b82a1-0b3c-33f1-cec7-56ebe8140111.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.790971 [INFO] serf: EventMemberJoin: Node 05fdee6d-0e80-f0b0-e1fb-c28f9133feea 127.0.0.1
TestJoinCommand_wan - 2019/12/30 19:07:29.783593 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/30 19:07:29.966904 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/12/30 19:07:29.966995 [INFO] manager: shutting down
TestJoinCommand_wan - 2019/12/30 19:07:29.967056 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestJoinCommand_wan - 2019/12/30 19:07:29.967345 [ERR] consul: failed to establish leadership: raft is already shutdown
TestJoinCommand_wan - 2019/12/30 19:07:29.967561 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestJoinCommand_wan - 2019/12/30 19:07:29.967630 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestJoinCommand_wan - 2019/12/30 19:07:29.967956 [INFO] agent: Synced node info
TestJoinCommand_wan - 2019/12/30 19:07:29.968169 [INFO] agent: consul server down
TestJoinCommand_wan - 2019/12/30 19:07:29.968215 [INFO] agent: shutdown complete
TestJoinCommand_wan - 2019/12/30 19:07:29.968262 [INFO] agent: Stopping DNS server 127.0.0.1:25013 (tcp)
TestJoinCommand_wan - 2019/12/30 19:07:29.968405 [INFO] agent: Stopping DNS server 127.0.0.1:25013 (udp)
TestJoinCommand_wan - 2019/12/30 19:07:29.968563 [INFO] agent: Stopping HTTP server 127.0.0.1:25014 (tcp)
TestJoinCommand_wan - 2019/12/30 19:07:29.968792 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommand_wan - 2019/12/30 19:07:29.968862 [INFO] agent: Endpoints down
TestJoinCommand_wan - 2019/12/30 19:07:29.968903 [INFO] agent: Requesting shutdown
TestJoinCommand_wan - 2019/12/30 19:07:29.968959 [INFO] consul: shutting down server
TestJoinCommand_wan - 2019/12/30 19:07:29.969005 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.058436 [INFO] manager: shutting down
TestJoinCommand_wan - 2019/12/30 19:07:30.117497 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.117529 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.117617 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.117760 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.117763 [INFO] agent: consul server down
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.117897 [INFO] agent: shutdown complete
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.117956 [INFO] agent: Stopping DNS server 127.0.0.1:25019 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.118106 [INFO] agent: Stopping DNS server 127.0.0.1:25019 (udp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.118267 [INFO] agent: Stopping HTTP server 127.0.0.1:25020 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.118490 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.118572 [INFO] agent: Endpoints down
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.118617 [INFO] agent: Requesting shutdown
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.118666 [INFO] consul: shutting down server
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.118714 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/12/30 19:07:30.182891 [INFO] manager: shutting down
TestJoinCommand_wan - 2019/12/30 19:07:30.183645 [INFO] agent: consul server down
TestJoinCommand_wan - 2019/12/30 19:07:30.183710 [INFO] agent: shutdown complete
TestJoinCommand_wan - 2019/12/30 19:07:30.183769 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (tcp)
TestJoinCommand_wan - 2019/12/30 19:07:30.183911 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (udp)
TestJoinCommand_wan - 2019/12/30 19:07:30.184082 [INFO] agent: Stopping HTTP server 127.0.0.1:25002 (tcp)
TestJoinCommand_wan - 2019/12/30 19:07:30.184743 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommand_wan - 2019/12/30 19:07:30.184896 [INFO] agent: Endpoints down
--- PASS: TestJoinCommand_wan (6.81s)
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.241063 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.316144 [INFO] manager: shutting down
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.316748 [INFO] agent: consul server down
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.316805 [INFO] agent: shutdown complete
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.316861 [INFO] agent: Stopping DNS server 127.0.0.1:25007 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.316995 [INFO] agent: Stopping DNS server 127.0.0.1:25007 (udp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.317139 [INFO] agent: Stopping HTTP server 127.0.0.1:25008 (tcp)
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.317554 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommandJoin_lan - 2019/12/30 19:07:30.317657 [INFO] agent: Endpoints down
--- PASS: TestJoinCommandJoin_lan (6.94s)
PASS
ok  	github.com/hashicorp/consul/command/join	7.528s
=== RUN   TestKeygenCommand_noTabs
=== PAUSE TestKeygenCommand_noTabs
=== RUN   TestKeygenCommand
=== PAUSE TestKeygenCommand
=== CONT  TestKeygenCommand_noTabs
--- PASS: TestKeygenCommand_noTabs (0.00s)
=== CONT  TestKeygenCommand
--- PASS: TestKeygenCommand (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/keygen	0.047s
=== RUN   TestKeyringCommand_noTabs
=== PAUSE TestKeyringCommand_noTabs
=== RUN   TestKeyringCommand
=== PAUSE TestKeyringCommand
=== RUN   TestKeyringCommand_help
=== PAUSE TestKeyringCommand_help
=== RUN   TestKeyringCommand_failedConnection
=== PAUSE TestKeyringCommand_failedConnection
=== RUN   TestKeyringCommand_invalidRelayFactor
=== PAUSE TestKeyringCommand_invalidRelayFactor
=== CONT  TestKeyringCommand_noTabs
=== CONT  TestKeyringCommand_failedConnection
=== CONT  TestKeyringCommand_help
=== CONT  TestKeyringCommand_invalidRelayFactor
--- PASS: TestKeyringCommand_help (0.00s)
=== CONT  TestKeyringCommand
--- PASS: TestKeyringCommand_invalidRelayFactor (0.00s)
--- PASS: TestKeyringCommand_noTabs (0.01s)
--- PASS: TestKeyringCommand_failedConnection (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestKeyringCommand - 2019/12/30 19:07:36.531041 [WARN] agent: Node name "Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKeyringCommand - 2019/12/30 19:07:36.533593 [DEBUG] tlsutil: Update with version 1
TestKeyringCommand - 2019/12/30 19:07:36.539569 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:07:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a Address:127.0.0.1:28006}]
2019/12/30 19:07:37 [INFO]  raft: Node at 127.0.0.1:28006 [Follower] entering Follower state (Leader: "")
TestKeyringCommand - 2019/12/30 19:07:37.745951 [INFO] serf: EventMemberJoin: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1 127.0.0.1
TestKeyringCommand - 2019/12/30 19:07:37.749741 [INFO] serf: EventMemberJoin: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a 127.0.0.1
TestKeyringCommand - 2019/12/30 19:07:37.750938 [INFO] consul: Adding LAN server Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a (Addr: tcp/127.0.0.1:28006) (DC: dc1)
TestKeyringCommand - 2019/12/30 19:07:37.751313 [INFO] consul: Handled member-join event for server "Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1" in area "wan"
TestKeyringCommand - 2019/12/30 19:07:37.751666 [INFO] agent: Started DNS server 127.0.0.1:28001 (udp)
TestKeyringCommand - 2019/12/30 19:07:37.754943 [INFO] agent: Started DNS server 127.0.0.1:28001 (tcp)
TestKeyringCommand - 2019/12/30 19:07:37.759261 [INFO] agent: Started HTTP server on 127.0.0.1:28002 (tcp)
TestKeyringCommand - 2019/12/30 19:07:37.759696 [INFO] agent: started state syncer
2019/12/30 19:07:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:37 [INFO]  raft: Node at 127.0.0.1:28006 [Candidate] entering Candidate state in term 2
2019/12/30 19:07:38 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:38 [INFO]  raft: Node at 127.0.0.1:28006 [Leader] entering Leader state
TestKeyringCommand - 2019/12/30 19:07:38.267696 [INFO] consul: cluster leadership acquired
TestKeyringCommand - 2019/12/30 19:07:38.268214 [INFO] consul: New leader elected: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a
TestKeyringCommand - 2019/12/30 19:07:38.336306 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/30 19:07:38.338181 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1
TestKeyringCommand - 2019/12/30 19:07:38.338639 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKeyringCommand - 2019/12/30 19:07:38.348628 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/30 19:07:38.350362 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a
TestKeyringCommand - 2019/12/30 19:07:38.353453 [DEBUG] http: Request GET /v1/operator/keyring (18.054492ms) from=127.0.0.1:45000
TestKeyringCommand - 2019/12/30 19:07:38.384641 [INFO] serf: Received install-key query
TestKeyringCommand - 2019/12/30 19:07:38.392752 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1
TestKeyringCommand - 2019/12/30 19:07:38.405000 [INFO] serf: Received install-key query
TestKeyringCommand - 2019/12/30 19:07:38.410126 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a
TestKeyringCommand - 2019/12/30 19:07:38.412046 [DEBUG] http: Request POST /v1/operator/keyring (33.619581ms) from=127.0.0.1:45002
TestKeyringCommand - 2019/12/30 19:07:38.421555 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/30 19:07:38.423004 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1
TestKeyringCommand - 2019/12/30 19:07:38.424232 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/30 19:07:38.425613 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a
TestKeyringCommand - 2019/12/30 19:07:38.427283 [DEBUG] http: Request GET /v1/operator/keyring (6.266837ms) from=127.0.0.1:45004
TestKeyringCommand - 2019/12/30 19:07:38.436721 [INFO] serf: Received use-key query
TestKeyringCommand - 2019/12/30 19:07:38.438679 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1
TestKeyringCommand - 2019/12/30 19:07:38.440051 [INFO] serf: Received use-key query
TestKeyringCommand - 2019/12/30 19:07:38.442204 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a
TestKeyringCommand - 2019/12/30 19:07:38.448004 [DEBUG] http: Request PUT /v1/operator/keyring (12.207999ms) from=127.0.0.1:45006
TestKeyringCommand - 2019/12/30 19:07:38.453687 [INFO] serf: Received remove-key query
TestKeyringCommand - 2019/12/30 19:07:38.455854 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1
TestKeyringCommand - 2019/12/30 19:07:38.457035 [INFO] serf: Received remove-key query
TestKeyringCommand - 2019/12/30 19:07:38.459007 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a
TestKeyringCommand - 2019/12/30 19:07:38.459796 [DEBUG] http: Request DELETE /v1/operator/keyring (6.946523ms) from=127.0.0.1:45008
TestKeyringCommand - 2019/12/30 19:07:38.467288 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/30 19:07:38.471593 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a.dc1
TestKeyringCommand - 2019/12/30 19:07:38.475413 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/30 19:07:38.484721 [DEBUG] serf: messageQueryResponseType: Node f2b7c1ec-d5d4-d2ed-2163-e4efbc84ee6a
TestKeyringCommand - 2019/12/30 19:07:38.486835 [DEBUG] http: Request GET /v1/operator/keyring (20.882235ms) from=127.0.0.1:45010
TestKeyringCommand - 2019/12/30 19:07:38.488970 [INFO] agent: Requesting shutdown
TestKeyringCommand - 2019/12/30 19:07:38.489048 [INFO] consul: shutting down server
TestKeyringCommand - 2019/12/30 19:07:38.489097 [WARN] serf: Shutdown without a Leave
TestKeyringCommand - 2019/12/30 19:07:38.490108 [ERR] agent: failed to sync remote state: No cluster leader
TestKeyringCommand - 2019/12/30 19:07:38.591279 [WARN] serf: Shutdown without a Leave
TestKeyringCommand - 2019/12/30 19:07:38.666435 [INFO] manager: shutting down
TestKeyringCommand - 2019/12/30 19:07:38.908920 [INFO] agent: consul server down
TestKeyringCommand - 2019/12/30 19:07:38.909041 [INFO] agent: shutdown complete
TestKeyringCommand - 2019/12/30 19:07:38.909130 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (tcp)
TestKeyringCommand - 2019/12/30 19:07:38.909339 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (udp)
TestKeyringCommand - 2019/12/30 19:07:38.909714 [INFO] agent: Stopping HTTP server 127.0.0.1:28002 (tcp)
TestKeyringCommand - 2019/12/30 19:07:38.910020 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKeyringCommand - 2019/12/30 19:07:38.910804 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKeyringCommand - 2019/12/30 19:07:38.911090 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKeyringCommand - 2019/12/30 19:07:38.911198 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKeyringCommand - 2019/12/30 19:07:38.911765 [INFO] agent: Waiting for endpoints to shut down
TestKeyringCommand - 2019/12/30 19:07:38.911966 [INFO] agent: Endpoints down
--- PASS: TestKeyringCommand (2.47s)
PASS
ok  	github.com/hashicorp/consul/command/keyring	2.758s
=== RUN   TestKVCommand_noTabs
=== PAUSE TestKVCommand_noTabs
=== CONT  TestKVCommand_noTabs
--- PASS: TestKVCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/kv	0.130s
=== RUN   TestKVDeleteCommand_noTabs
=== PAUSE TestKVDeleteCommand_noTabs
=== RUN   TestKVDeleteCommand_Validation
=== PAUSE TestKVDeleteCommand_Validation
=== RUN   TestKVDeleteCommand
=== PAUSE TestKVDeleteCommand
=== RUN   TestKVDeleteCommand_Recurse
=== PAUSE TestKVDeleteCommand_Recurse
=== RUN   TestKVDeleteCommand_CAS
=== PAUSE TestKVDeleteCommand_CAS
=== CONT  TestKVDeleteCommand_noTabs
=== CONT  TestKVDeleteCommand_CAS
=== CONT  TestKVDeleteCommand_Recurse
=== CONT  TestKVDeleteCommand
=== CONT  TestKVDeleteCommand_Validation
--- PASS: TestKVDeleteCommand_noTabs (0.01s)
--- PASS: TestKVDeleteCommand_Validation (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand - 2019/12/30 19:07:44.592488 [WARN] agent: Node name "Node dfa9cf5f-12f5-d69d-ddd1-cf04d70c1708" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:44.592812 [WARN] agent: Node name "Node 880a4a28-5f36-7969-b192-49da8aaddcdd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:44.593290 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand - 2019/12/30 19:07:44.593290 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand - 2019/12/30 19:07:44.610685 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:44.616803 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand_CAS - 2019/12/30 19:07:44.635099 [WARN] agent: Node name "Node 3912064c-ef8c-99e6-9294-1fd1ab9d58e9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand_CAS - 2019/12/30 19:07:44.635601 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand_CAS - 2019/12/30 19:07:44.637908 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:07:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:dfa9cf5f-12f5-d69d-ddd1-cf04d70c1708 Address:127.0.0.1:53506}]
2019/12/30 19:07:45 [INFO]  raft: Node at 127.0.0.1:53506 [Follower] entering Follower state (Leader: "")
TestKVDeleteCommand - 2019/12/30 19:07:45.505756 [INFO] serf: EventMemberJoin: Node dfa9cf5f-12f5-d69d-ddd1-cf04d70c1708.dc1 127.0.0.1
TestKVDeleteCommand - 2019/12/30 19:07:45.509234 [INFO] serf: EventMemberJoin: Node dfa9cf5f-12f5-d69d-ddd1-cf04d70c1708 127.0.0.1
TestKVDeleteCommand - 2019/12/30 19:07:45.512327 [INFO] agent: Started DNS server 127.0.0.1:53501 (udp)
TestKVDeleteCommand - 2019/12/30 19:07:45.513228 [INFO] consul: Handled member-join event for server "Node dfa9cf5f-12f5-d69d-ddd1-cf04d70c1708.dc1" in area "wan"
TestKVDeleteCommand - 2019/12/30 19:07:45.513315 [INFO] consul: Adding LAN server Node dfa9cf5f-12f5-d69d-ddd1-cf04d70c1708 (Addr: tcp/127.0.0.1:53506) (DC: dc1)
TestKVDeleteCommand - 2019/12/30 19:07:45.513610 [INFO] agent: Started DNS server 127.0.0.1:53501 (tcp)
TestKVDeleteCommand - 2019/12/30 19:07:45.516736 [INFO] agent: Started HTTP server on 127.0.0.1:53502 (tcp)
TestKVDeleteCommand - 2019/12/30 19:07:45.516894 [INFO] agent: started state syncer
2019/12/30 19:07:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:45 [INFO]  raft: Node at 127.0.0.1:53506 [Candidate] entering Candidate state in term 2
2019/12/30 19:07:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3912064c-ef8c-99e6-9294-1fd1ab9d58e9 Address:127.0.0.1:53512}]
2019/12/30 19:07:45 [INFO]  raft: Node at 127.0.0.1:53512 [Follower] entering Follower state (Leader: "")
2019/12/30 19:07:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:880a4a28-5f36-7969-b192-49da8aaddcdd Address:127.0.0.1:53518}]
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.580064 [INFO] serf: EventMemberJoin: Node 3912064c-ef8c-99e6-9294-1fd1ab9d58e9.dc1 127.0.0.1
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.580198 [INFO] serf: EventMemberJoin: Node 880a4a28-5f36-7969-b192-49da8aaddcdd.dc1 127.0.0.1
2019/12/30 19:07:45 [INFO]  raft: Node at 127.0.0.1:53518 [Follower] entering Follower state (Leader: "")
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.583603 [INFO] serf: EventMemberJoin: Node 3912064c-ef8c-99e6-9294-1fd1ab9d58e9 127.0.0.1
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.584784 [INFO] serf: EventMemberJoin: Node 880a4a28-5f36-7969-b192-49da8aaddcdd 127.0.0.1
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.586219 [INFO] consul: Adding LAN server Node 880a4a28-5f36-7969-b192-49da8aaddcdd (Addr: tcp/127.0.0.1:53518) (DC: dc1)
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.586551 [INFO] consul: Handled member-join event for server "Node 880a4a28-5f36-7969-b192-49da8aaddcdd.dc1" in area "wan"
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.589885 [INFO] consul: Handled member-join event for server "Node 3912064c-ef8c-99e6-9294-1fd1ab9d58e9.dc1" in area "wan"
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.590248 [INFO] consul: Adding LAN server Node 3912064c-ef8c-99e6-9294-1fd1ab9d58e9 (Addr: tcp/127.0.0.1:53512) (DC: dc1)
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.633098 [INFO] agent: Started DNS server 127.0.0.1:53513 (udp)
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.640149 [INFO] agent: Started DNS server 127.0.0.1:53513 (tcp)
2019/12/30 19:07:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:45 [INFO]  raft: Node at 127.0.0.1:53512 [Candidate] entering Candidate state in term 2
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.644829 [INFO] agent: Started HTTP server on 127.0.0.1:53514 (tcp)
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:45.644949 [INFO] agent: started state syncer
2019/12/30 19:07:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:45 [INFO]  raft: Node at 127.0.0.1:53518 [Candidate] entering Candidate state in term 2
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.648262 [INFO] agent: Started DNS server 127.0.0.1:53507 (udp)
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.656100 [INFO] agent: Started DNS server 127.0.0.1:53507 (tcp)
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.662282 [INFO] agent: Started HTTP server on 127.0.0.1:53508 (tcp)
TestKVDeleteCommand_CAS - 2019/12/30 19:07:45.662462 [INFO] agent: started state syncer
2019/12/30 19:07:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:46 [INFO]  raft: Node at 127.0.0.1:53506 [Leader] entering Leader state
TestKVDeleteCommand - 2019/12/30 19:07:46.143583 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand - 2019/12/30 19:07:46.144134 [INFO] consul: New leader elected: Node dfa9cf5f-12f5-d69d-ddd1-cf04d70c1708
2019/12/30 19:07:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:46 [INFO]  raft: Node at 127.0.0.1:53518 [Leader] entering Leader state
2019/12/30 19:07:46 [INFO]  raft: Node at 127.0.0.1:53512 [Leader] entering Leader state
TestKVDeleteCommand_CAS - 2019/12/30 19:07:46.243347 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand_CAS - 2019/12/30 19:07:46.243829 [INFO] consul: New leader elected: Node 3912064c-ef8c-99e6-9294-1fd1ab9d58e9
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:46.244114 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:46.244547 [INFO] consul: New leader elected: Node 880a4a28-5f36-7969-b192-49da8aaddcdd
TestKVDeleteCommand - 2019/12/30 19:07:46.493048 [INFO] agent: Synced node info
TestKVDeleteCommand - 2019/12/30 19:07:46.493170 [DEBUG] agent: Node info in sync
TestKVDeleteCommand - 2019/12/30 19:07:46.495337 [DEBUG] http: Request PUT /v1/kv/foo (323.171125ms) from=127.0.0.1:38456
TestKVDeleteCommand_CAS - 2019/12/30 19:07:46.575905 [INFO] agent: Synced node info
TestKVDeleteCommand_CAS - 2019/12/30 19:07:46.577193 [DEBUG] http: Request PUT /v1/kv/foo (198.927412ms) from=127.0.0.1:40082
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:46.752680 [DEBUG] http: Request PUT /v1/kv/foo/a (181.369267ms) from=127.0.0.1:57102
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:46.757257 [INFO] agent: Synced node info
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:46.955890 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:46.956016 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_CAS - 2019/12/30 19:07:46.993897 [DEBUG] http: Request DELETE /v1/kv/foo?cas=1 (392.457009ms) from=127.0.0.1:40088
TestKVDeleteCommand_CAS - 2019/12/30 19:07:46.997290 [DEBUG] http: Request GET /v1/kv/foo (1.384705ms) from=127.0.0.1:40082
TestKVDeleteCommand - 2019/12/30 19:07:47.068688 [DEBUG] http: Request DELETE /v1/kv/foo (569.601161ms) from=127.0.0.1:38460
TestKVDeleteCommand - 2019/12/30 19:07:47.071325 [DEBUG] http: Request GET /v1/kv/foo (239.34µs) from=127.0.0.1:38456
TestKVDeleteCommand - 2019/12/30 19:07:47.071952 [INFO] agent: Requesting shutdown
TestKVDeleteCommand - 2019/12/30 19:07:47.072019 [INFO] consul: shutting down server
TestKVDeleteCommand - 2019/12/30 19:07:47.072060 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand - 2019/12/30 19:07:47.141533 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.149918 [DEBUG] http: Request PUT /v1/kv/foo/b (393.525038ms) from=127.0.0.1:57102
TestKVDeleteCommand - 2019/12/30 19:07:47.216598 [INFO] manager: shutting down
TestKVDeleteCommand - 2019/12/30 19:07:47.217700 [INFO] agent: consul server down
TestKVDeleteCommand - 2019/12/30 19:07:47.217769 [INFO] agent: shutdown complete
TestKVDeleteCommand - 2019/12/30 19:07:47.217828 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (tcp)
TestKVDeleteCommand - 2019/12/30 19:07:47.217978 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (udp)
TestKVDeleteCommand - 2019/12/30 19:07:47.218141 [INFO] agent: Stopping HTTP server 127.0.0.1:53502 (tcp)
TestKVDeleteCommand - 2019/12/30 19:07:47.218931 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand - 2019/12/30 19:07:47.219011 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand (2.78s)
TestKVDeleteCommand - 2019/12/30 19:07:47.243059 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.294948 [DEBUG] http: Request DELETE /v1/kv/foo?cas=5 (293.236309ms) from=127.0.0.1:40090
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.297184 [DEBUG] http: Request GET /v1/kv/foo (190.338µs) from=127.0.0.1:40082
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.297749 [INFO] agent: Requesting shutdown
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.297829 [INFO] consul: shutting down server
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.297873 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.358215 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.424913 [INFO] manager: shutting down
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.425763 [INFO] agent: consul server down
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.425819 [INFO] agent: shutdown complete
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.425869 [INFO] agent: Stopping DNS server 127.0.0.1:53507 (tcp)
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.426015 [INFO] agent: Stopping DNS server 127.0.0.1:53507 (udp)
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.426228 [INFO] agent: Stopping HTTP server 127.0.0.1:53508 (tcp)
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.427006 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.427091 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand_CAS (2.99s)
TestKVDeleteCommand_CAS - 2019/12/30 19:07:47.444187 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.584551 [DEBUG] http: Request PUT /v1/kv/food (432.563433ms) from=127.0.0.1:57102
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.793404 [DEBUG] http: Request DELETE /v1/kv/foo?recurse= (205.804598ms) from=127.0.0.1:57108
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.795915 [DEBUG] http: Request GET /v1/kv/foo/a (210.339µs) from=127.0.0.1:57102
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.797224 [DEBUG] http: Request GET /v1/kv/foo/b (179.005µs) from=127.0.0.1:57102
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.798521 [DEBUG] http: Request GET /v1/kv/food (201.005µs) from=127.0.0.1:57102
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.799143 [INFO] agent: Requesting shutdown
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.799212 [INFO] consul: shutting down server
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.799258 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:47.933216 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.076334 [INFO] manager: shutting down
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208276 [ERR] connect: Apply failed leadership lost while committing log
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208372 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208501 [INFO] agent: consul server down
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208546 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208611 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208694 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208746 [ERR] consul: failed to transfer leadership in 3 attempts
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208555 [INFO] agent: shutdown complete
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.208866 [INFO] agent: Stopping DNS server 127.0.0.1:53513 (tcp)
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.209064 [INFO] agent: Stopping DNS server 127.0.0.1:53513 (udp)
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.209250 [INFO] agent: Stopping HTTP server 127.0.0.1:53514 (tcp)
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.210192 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand_Recurse - 2019/12/30 19:07:48.210474 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand_Recurse (3.78s)
PASS
ok  	github.com/hashicorp/consul/command/kv/del	4.186s
=== RUN   TestKVExportCommand_noTabs
=== PAUSE TestKVExportCommand_noTabs
=== RUN   TestKVExportCommand
=== PAUSE TestKVExportCommand
=== CONT  TestKVExportCommand_noTabs
=== CONT  TestKVExportCommand
--- PASS: TestKVExportCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVExportCommand - 2019/12/30 19:07:47.949950 [WARN] agent: Node name "Node 95c4fb37-49cf-f6b1-7fcb-a1144ad25930" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVExportCommand - 2019/12/30 19:07:47.950951 [DEBUG] tlsutil: Update with version 1
TestKVExportCommand - 2019/12/30 19:07:47.962342 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:07:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:95c4fb37-49cf-f6b1-7fcb-a1144ad25930 Address:127.0.0.1:23506}]
2019/12/30 19:07:48 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestKVExportCommand - 2019/12/30 19:07:48.955378 [INFO] serf: EventMemberJoin: Node 95c4fb37-49cf-f6b1-7fcb-a1144ad25930.dc1 127.0.0.1
TestKVExportCommand - 2019/12/30 19:07:48.959374 [INFO] serf: EventMemberJoin: Node 95c4fb37-49cf-f6b1-7fcb-a1144ad25930 127.0.0.1
TestKVExportCommand - 2019/12/30 19:07:48.960882 [INFO] consul: Adding LAN server Node 95c4fb37-49cf-f6b1-7fcb-a1144ad25930 (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestKVExportCommand - 2019/12/30 19:07:48.961727 [INFO] consul: Handled member-join event for server "Node 95c4fb37-49cf-f6b1-7fcb-a1144ad25930.dc1" in area "wan"
TestKVExportCommand - 2019/12/30 19:07:48.963571 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestKVExportCommand - 2019/12/30 19:07:48.964318 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestKVExportCommand - 2019/12/30 19:07:48.967192 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestKVExportCommand - 2019/12/30 19:07:48.967370 [INFO] agent: started state syncer
2019/12/30 19:07:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:07:49 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/30 19:07:49 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:07:49 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestKVExportCommand - 2019/12/30 19:07:49.426965 [INFO] consul: cluster leadership acquired
TestKVExportCommand - 2019/12/30 19:07:49.427478 [INFO] consul: New leader elected: Node 95c4fb37-49cf-f6b1-7fcb-a1144ad25930
TestKVExportCommand - 2019/12/30 19:07:49.742744 [INFO] agent: Synced node info
TestKVExportCommand - 2019/12/30 19:07:49.748741 [DEBUG] http: Request PUT /v1/kv/foo/a (223.361409ms) from=127.0.0.1:38094
TestKVExportCommand - 2019/12/30 19:07:50.127563 [DEBUG] http: Request PUT /v1/kv/foo/b (364.989593ms) from=127.0.0.1:38094
TestKVExportCommand - 2019/12/30 19:07:50.625208 [DEBUG] http: Request PUT /v1/kv/foo/c (495.369139ms) from=127.0.0.1:38094
TestKVExportCommand - 2019/12/30 19:07:50.778260 [DEBUG] http: Request PUT /v1/kv/bar (146.422982ms) from=127.0.0.1:38094
TestKVExportCommand - 2019/12/30 19:07:50.786439 [DEBUG] http: Request GET /v1/kv/foo?recurse= (4.059444ms) from=127.0.0.1:38096
TestKVExportCommand - 2019/12/30 19:07:50.800536 [INFO] agent: Requesting shutdown
TestKVExportCommand - 2019/12/30 19:07:50.800643 [INFO] consul: shutting down server
TestKVExportCommand - 2019/12/30 19:07:50.800691 [WARN] serf: Shutdown without a Leave
TestKVExportCommand - 2019/12/30 19:07:50.916562 [WARN] serf: Shutdown without a Leave
TestKVExportCommand - 2019/12/30 19:07:51.041666 [INFO] manager: shutting down
TestKVExportCommand - 2019/12/30 19:07:51.166604 [ERR] connect: Apply failed leadership lost while committing log
TestKVExportCommand - 2019/12/30 19:07:51.166715 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVExportCommand - 2019/12/30 19:07:51.166853 [INFO] agent: consul server down
TestKVExportCommand - 2019/12/30 19:07:51.166892 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVExportCommand - 2019/12/30 19:07:51.166966 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVExportCommand - 2019/12/30 19:07:51.167021 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestKVExportCommand - 2019/12/30 19:07:51.167090 [ERR] consul: failed to transfer leadership in 3 attempts
TestKVExportCommand - 2019/12/30 19:07:51.166900 [INFO] agent: shutdown complete
TestKVExportCommand - 2019/12/30 19:07:51.167218 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestKVExportCommand - 2019/12/30 19:07:51.167359 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestKVExportCommand - 2019/12/30 19:07:51.167494 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestKVExportCommand - 2019/12/30 19:07:51.168135 [INFO] agent: Waiting for endpoints to shut down
TestKVExportCommand - 2019/12/30 19:07:51.168278 [INFO] agent: Endpoints down
--- PASS: TestKVExportCommand (3.32s)
PASS
ok  	github.com/hashicorp/consul/command/kv/exp	3.595s
=== RUN   TestKVGetCommand_noTabs
=== PAUSE TestKVGetCommand_noTabs
=== RUN   TestKVGetCommand_Validation
=== PAUSE TestKVGetCommand_Validation
=== RUN   TestKVGetCommand
=== PAUSE TestKVGetCommand
=== RUN   TestKVGetCommand_Base64
=== PAUSE TestKVGetCommand_Base64
=== RUN   TestKVGetCommand_Missing
=== PAUSE TestKVGetCommand_Missing
=== RUN   TestKVGetCommand_Empty
=== PAUSE TestKVGetCommand_Empty
=== RUN   TestKVGetCommand_Detailed
=== PAUSE TestKVGetCommand_Detailed
=== RUN   TestKVGetCommand_Keys
=== PAUSE TestKVGetCommand_Keys
=== RUN   TestKVGetCommand_Recurse
=== PAUSE TestKVGetCommand_Recurse
=== RUN   TestKVGetCommand_RecurseBase64
=== PAUSE TestKVGetCommand_RecurseBase64
=== RUN   TestKVGetCommand_DetailedBase64
--- SKIP: TestKVGetCommand_DetailedBase64 (0.00s)
    kv_get_test.go:338: DM-skipped
=== CONT  TestKVGetCommand_noTabs
=== CONT  TestKVGetCommand_Detailed
=== CONT  TestKVGetCommand_Empty
=== CONT  TestKVGetCommand_Base64
=== CONT  TestKVGetCommand_Missing
--- PASS: TestKVGetCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Empty - 2019/12/30 19:08:18.763716 [WARN] agent: Node name "Node d6de03a9-c870-c1d8-930f-0725ad1531c5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Empty - 2019/12/30 19:08:18.764535 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Empty - 2019/12/30 19:08:18.785335 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Base64 - 2019/12/30 19:08:18.805974 [WARN] agent: Node name "Node a24bece6-7005-b021-90b6-4ac56223d8cd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Base64 - 2019/12/30 19:08:18.806503 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Base64 - 2019/12/30 19:08:18.809867 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Detailed - 2019/12/30 19:08:18.838800 [WARN] agent: Node name "Node ffd36984-de55-3eb2-f129-446b2bab4a10" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Detailed - 2019/12/30 19:08:18.839794 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Detailed - 2019/12/30 19:08:18.843054 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Missing - 2019/12/30 19:08:18.844646 [WARN] agent: Node name "Node d84618b6-e331-6631-5e74-298eb815225b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Missing - 2019/12/30 19:08:18.845258 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Missing - 2019/12/30 19:08:18.848364 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:08:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d6de03a9-c870-c1d8-930f-0725ad1531c5 Address:127.0.0.1:20518}]
2019/12/30 19:08:19 [INFO]  raft: Node at 127.0.0.1:20518 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Empty - 2019/12/30 19:08:19.842828 [INFO] serf: EventMemberJoin: Node d6de03a9-c870-c1d8-930f-0725ad1531c5.dc1 127.0.0.1
TestKVGetCommand_Empty - 2019/12/30 19:08:19.849171 [INFO] serf: EventMemberJoin: Node d6de03a9-c870-c1d8-930f-0725ad1531c5 127.0.0.1
TestKVGetCommand_Empty - 2019/12/30 19:08:19.853628 [INFO] consul: Adding LAN server Node d6de03a9-c870-c1d8-930f-0725ad1531c5 (Addr: tcp/127.0.0.1:20518) (DC: dc1)
TestKVGetCommand_Empty - 2019/12/30 19:08:19.854057 [INFO] consul: Handled member-join event for server "Node d6de03a9-c870-c1d8-930f-0725ad1531c5.dc1" in area "wan"
TestKVGetCommand_Empty - 2019/12/30 19:08:19.854569 [INFO] agent: Started DNS server 127.0.0.1:20513 (tcp)
TestKVGetCommand_Empty - 2019/12/30 19:08:19.855035 [INFO] agent: Started DNS server 127.0.0.1:20513 (udp)
TestKVGetCommand_Empty - 2019/12/30 19:08:19.857880 [INFO] agent: Started HTTP server on 127.0.0.1:20514 (tcp)
TestKVGetCommand_Empty - 2019/12/30 19:08:19.858036 [INFO] agent: started state syncer
2019/12/30 19:08:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:19 [INFO]  raft: Node at 127.0.0.1:20518 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a24bece6-7005-b021-90b6-4ac56223d8cd Address:127.0.0.1:20506}]
2019/12/30 19:08:19 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.946860 [INFO] serf: EventMemberJoin: Node a24bece6-7005-b021-90b6-4ac56223d8cd.dc1 127.0.0.1
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.951026 [INFO] serf: EventMemberJoin: Node a24bece6-7005-b021-90b6-4ac56223d8cd 127.0.0.1
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.952220 [INFO] consul: Handled member-join event for server "Node a24bece6-7005-b021-90b6-4ac56223d8cd.dc1" in area "wan"
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.952394 [INFO] consul: Adding LAN server Node a24bece6-7005-b021-90b6-4ac56223d8cd (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.957855 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.958176 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.960811 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestKVGetCommand_Base64 - 2019/12/30 19:08:19.960989 [INFO] agent: started state syncer
2019/12/30 19:08:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:20 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d84618b6-e331-6631-5e74-298eb815225b Address:127.0.0.1:20524}]
2019/12/30 19:08:20 [INFO]  raft: Node at 127.0.0.1:20524 [Follower] entering Follower state (Leader: "")
2019/12/30 19:08:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ffd36984-de55-3eb2-f129-446b2bab4a10 Address:127.0.0.1:20512}]
2019/12/30 19:08:20 [INFO]  raft: Node at 127.0.0.1:20512 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Missing - 2019/12/30 19:08:20.401808 [INFO] serf: EventMemberJoin: Node d84618b6-e331-6631-5e74-298eb815225b.dc1 127.0.0.1
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.404833 [INFO] serf: EventMemberJoin: Node ffd36984-de55-3eb2-f129-446b2bab4a10.dc1 127.0.0.1
2019/12/30 19:08:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:20 [INFO]  raft: Node at 127.0.0.1:20524 [Candidate] entering Candidate state in term 2
TestKVGetCommand_Missing - 2019/12/30 19:08:20.449958 [INFO] serf: EventMemberJoin: Node d84618b6-e331-6631-5e74-298eb815225b 127.0.0.1
TestKVGetCommand_Missing - 2019/12/30 19:08:20.452354 [INFO] consul: Adding LAN server Node d84618b6-e331-6631-5e74-298eb815225b (Addr: tcp/127.0.0.1:20524) (DC: dc1)
TestKVGetCommand_Missing - 2019/12/30 19:08:20.452597 [INFO] consul: Handled member-join event for server "Node d84618b6-e331-6631-5e74-298eb815225b.dc1" in area "wan"
TestKVGetCommand_Missing - 2019/12/30 19:08:20.454351 [INFO] agent: Started DNS server 127.0.0.1:20519 (tcp)
2019/12/30 19:08:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:20 [INFO]  raft: Node at 127.0.0.1:20512 [Candidate] entering Candidate state in term 2
TestKVGetCommand_Missing - 2019/12/30 19:08:20.456339 [INFO] agent: Started DNS server 127.0.0.1:20519 (udp)
TestKVGetCommand_Missing - 2019/12/30 19:08:20.462145 [INFO] agent: Started HTTP server on 127.0.0.1:20520 (tcp)
TestKVGetCommand_Missing - 2019/12/30 19:08:20.462558 [INFO] agent: started state syncer
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.466188 [INFO] serf: EventMemberJoin: Node ffd36984-de55-3eb2-f129-446b2bab4a10 127.0.0.1
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.481318 [INFO] agent: Started DNS server 127.0.0.1:20507 (udp)
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.486173 [INFO] consul: Adding LAN server Node ffd36984-de55-3eb2-f129-446b2bab4a10 (Addr: tcp/127.0.0.1:20512) (DC: dc1)
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.486467 [INFO] consul: Handled member-join event for server "Node ffd36984-de55-3eb2-f129-446b2bab4a10.dc1" in area "wan"
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.488423 [INFO] agent: Started DNS server 127.0.0.1:20507 (tcp)
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.492976 [INFO] agent: Started HTTP server on 127.0.0.1:20508 (tcp)
TestKVGetCommand_Detailed - 2019/12/30 19:08:20.493132 [INFO] agent: started state syncer
2019/12/30 19:08:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:21 [INFO]  raft: Node at 127.0.0.1:20518 [Leader] entering Leader state
TestKVGetCommand_Empty - 2019/12/30 19:08:21.437133 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Empty - 2019/12/30 19:08:21.437675 [INFO] consul: New leader elected: Node d6de03a9-c870-c1d8-930f-0725ad1531c5
2019/12/30 19:08:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:21 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestKVGetCommand_Base64 - 2019/12/30 19:08:21.544347 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Base64 - 2019/12/30 19:08:21.544878 [INFO] consul: New leader elected: Node a24bece6-7005-b021-90b6-4ac56223d8cd
2019/12/30 19:08:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:21 [INFO]  raft: Node at 127.0.0.1:20524 [Leader] entering Leader state
2019/12/30 19:08:21 [INFO]  raft: Node at 127.0.0.1:20512 [Leader] entering Leader state
TestKVGetCommand_Detailed - 2019/12/30 19:08:21.726340 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Missing - 2019/12/30 19:08:21.726430 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Detailed - 2019/12/30 19:08:21.726757 [INFO] consul: New leader elected: Node ffd36984-de55-3eb2-f129-446b2bab4a10
TestKVGetCommand_Missing - 2019/12/30 19:08:21.726758 [INFO] consul: New leader elected: Node d84618b6-e331-6631-5e74-298eb815225b
TestKVGetCommand_Empty - 2019/12/30 19:08:21.835766 [INFO] agent: Synced node info
TestKVGetCommand_Empty - 2019/12/30 19:08:21.839049 [DEBUG] http: Request PUT /v1/kv/empty (321.898071ms) from=127.0.0.1:42304
TestKVGetCommand_Empty - 2019/12/30 19:08:21.847939 [DEBUG] http: Request GET /v1/kv/empty (4.690794ms) from=127.0.0.1:42308
TestKVGetCommand_Empty - 2019/12/30 19:08:21.851365 [INFO] agent: Requesting shutdown
TestKVGetCommand_Empty - 2019/12/30 19:08:21.851483 [INFO] consul: shutting down server
TestKVGetCommand_Empty - 2019/12/30 19:08:21.851546 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Missing - 2019/12/30 19:08:21.865336 [DEBUG] http: Request GET /v1/kv/not-a-real-key (470.68µs) from=127.0.0.1:59132
TestKVGetCommand_Missing - 2019/12/30 19:08:21.870273 [INFO] agent: Requesting shutdown
TestKVGetCommand_Missing - 2019/12/30 19:08:21.870639 [INFO] consul: shutting down server
TestKVGetCommand_Missing - 2019/12/30 19:08:21.870912 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Empty - 2019/12/30 19:08:21.942368 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Base64 - 2019/12/30 19:08:21.951822 [DEBUG] http: Request PUT /v1/kv/foo (302.462543ms) from=127.0.0.1:33480
TestKVGetCommand_Base64 - 2019/12/30 19:08:21.980951 [INFO] agent: Synced node info
TestKVGetCommand_Base64 - 2019/12/30 19:08:21.981191 [DEBUG] agent: Node info in sync
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.004313 [DEBUG] http: Request GET /v1/kv/foo (4.643126ms) from=127.0.0.1:33486
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.006420 [INFO] agent: Requesting shutdown
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.006525 [INFO] consul: shutting down server
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.006577 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Missing - 2019/12/30 19:08:22.050708 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Empty - 2019/12/30 19:08:22.051436 [INFO] manager: shutting down
TestKVGetCommand_Empty - 2019/12/30 19:08:22.053202 [INFO] agent: consul server down
TestKVGetCommand_Empty - 2019/12/30 19:08:22.053281 [INFO] agent: shutdown complete
TestKVGetCommand_Empty - 2019/12/30 19:08:22.053338 [INFO] agent: Stopping DNS server 127.0.0.1:20513 (tcp)
TestKVGetCommand_Empty - 2019/12/30 19:08:22.053483 [INFO] agent: Stopping DNS server 127.0.0.1:20513 (udp)
TestKVGetCommand_Empty - 2019/12/30 19:08:22.053633 [INFO] agent: Stopping HTTP server 127.0.0.1:20514 (tcp)
TestKVGetCommand_Empty - 2019/12/30 19:08:22.054289 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Empty - 2019/12/30 19:08:22.054491 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVGetCommand_Empty - 2019/12/30 19:08:22.055012 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Empty - 2019/12/30 19:08:22.055198 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Empty (3.50s)
=== CONT  TestKVGetCommand
TestKVGetCommand_Empty - 2019/12/30 19:08:22.058016 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVGetCommand_Empty - 2019/12/30 19:08:22.058115 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.117306 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.140022 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand - 2019/12/30 19:08:22.156420 [WARN] agent: Node name "Node d3f948cb-8979-0d54-8803-701a7d641862" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand - 2019/12/30 19:08:22.156856 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand - 2019/12/30 19:08:22.159110 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Missing - 2019/12/30 19:08:22.210437 [INFO] manager: shutting down
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.210770 [INFO] manager: shutting down
TestKVGetCommand_Missing - 2019/12/30 19:08:22.211347 [INFO] agent: consul server down
TestKVGetCommand_Missing - 2019/12/30 19:08:22.211413 [INFO] agent: shutdown complete
TestKVGetCommand_Missing - 2019/12/30 19:08:22.211476 [INFO] agent: Stopping DNS server 127.0.0.1:20519 (tcp)
TestKVGetCommand_Missing - 2019/12/30 19:08:22.211622 [INFO] agent: Stopping DNS server 127.0.0.1:20519 (udp)
TestKVGetCommand_Missing - 2019/12/30 19:08:22.211797 [INFO] agent: Stopping HTTP server 127.0.0.1:20520 (tcp)
TestKVGetCommand_Missing - 2019/12/30 19:08:22.212180 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Missing - 2019/12/30 19:08:22.212263 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestKVGetCommand_Missing - 2019/12/30 19:08:22.212346 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestKVGetCommand_Missing - 2019/12/30 19:08:22.212388 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestKVGetCommand_Missing - 2019/12/30 19:08:22.212554 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Missing (3.64s)
=== CONT  TestKVGetCommand_Validation
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.214928 [INFO] agent: consul server down
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.214989 [INFO] agent: shutdown complete
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.215057 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.215207 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.215357 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.215986 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.216082 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.216332 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.216493 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Base64 (3.66s)
=== CONT  TestKVGetCommand_RecurseBase64
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.218541 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.218608 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.218667 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestKVGetCommand_Base64 - 2019/12/30 19:08:22.218711 [ERR] consul: failed to transfer leadership in 3 attempts
--- PASS: TestKVGetCommand_Validation (0.01s)
=== CONT  TestKVGetCommand_Recurse
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:22.301028 [WARN] agent: Node name "Node eab7861f-25c9-623c-abef-cebb0f5f1108" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:22.301375 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:22.303541 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Recurse - 2019/12/30 19:08:22.332290 [WARN] agent: Node name "Node 48ced1d3-ece0-27ae-67d9-0c41caa532af" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Recurse - 2019/12/30 19:08:22.332780 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Recurse - 2019/12/30 19:08:22.336050 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.462149 [DEBUG] http: Request PUT /v1/kv/foo (433.92111ms) from=127.0.0.1:53234
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.468163 [INFO] agent: Synced node info
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.468283 [DEBUG] agent: Node info in sync
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.468602 [DEBUG] http: Request GET /v1/kv/foo (1.179032ms) from=127.0.0.1:53236
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.470773 [INFO] agent: Requesting shutdown
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.470854 [INFO] consul: shutting down server
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.470899 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.542313 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.625858 [INFO] manager: shutting down
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.627195 [INFO] agent: consul server down
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.627263 [INFO] agent: shutdown complete
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.627320 [INFO] agent: Stopping DNS server 127.0.0.1:20507 (tcp)
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.627465 [INFO] agent: Stopping DNS server 127.0.0.1:20507 (udp)
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.627620 [INFO] agent: Stopping HTTP server 127.0.0.1:20508 (tcp)
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.628307 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.628416 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.628726 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.628940 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Detailed (4.07s)
=== CONT  TestKVGetCommand_Keys
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.629604 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVGetCommand_Detailed - 2019/12/30 19:08:22.629688 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Keys - 2019/12/30 19:08:22.755420 [WARN] agent: Node name "Node f32f71b3-1df8-d58d-fca8-866b8b424f86" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Keys - 2019/12/30 19:08:22.756149 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Keys - 2019/12/30 19:08:22.766011 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:08:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d3f948cb-8979-0d54-8803-701a7d641862 Address:127.0.0.1:20530}]
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20530 [Follower] entering Follower state (Leader: "")
TestKVGetCommand - 2019/12/30 19:08:23.114513 [INFO] serf: EventMemberJoin: Node d3f948cb-8979-0d54-8803-701a7d641862.dc1 127.0.0.1
TestKVGetCommand - 2019/12/30 19:08:23.121931 [INFO] serf: EventMemberJoin: Node d3f948cb-8979-0d54-8803-701a7d641862 127.0.0.1
TestKVGetCommand - 2019/12/30 19:08:23.124015 [INFO] consul: Adding LAN server Node d3f948cb-8979-0d54-8803-701a7d641862 (Addr: tcp/127.0.0.1:20530) (DC: dc1)
TestKVGetCommand - 2019/12/30 19:08:23.124246 [INFO] consul: Handled member-join event for server "Node d3f948cb-8979-0d54-8803-701a7d641862.dc1" in area "wan"
TestKVGetCommand - 2019/12/30 19:08:23.125149 [INFO] agent: Started DNS server 127.0.0.1:20525 (tcp)
TestKVGetCommand - 2019/12/30 19:08:23.125455 [INFO] agent: Started DNS server 127.0.0.1:20525 (udp)
TestKVGetCommand - 2019/12/30 19:08:23.127767 [INFO] agent: Started HTTP server on 127.0.0.1:20526 (tcp)
TestKVGetCommand - 2019/12/30 19:08:23.127943 [INFO] agent: started state syncer
2019/12/30 19:08:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20530 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:48ced1d3-ece0-27ae-67d9-0c41caa532af Address:127.0.0.1:20542}]
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20542 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.272421 [INFO] serf: EventMemberJoin: Node 48ced1d3-ece0-27ae-67d9-0c41caa532af.dc1 127.0.0.1
2019/12/30 19:08:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eab7861f-25c9-623c-abef-cebb0f5f1108 Address:127.0.0.1:20536}]
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20536 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.278291 [INFO] serf: EventMemberJoin: Node 48ced1d3-ece0-27ae-67d9-0c41caa532af 127.0.0.1
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.279075 [INFO] consul: Handled member-join event for server "Node 48ced1d3-ece0-27ae-67d9-0c41caa532af.dc1" in area "wan"
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.279461 [INFO] consul: Adding LAN server Node 48ced1d3-ece0-27ae-67d9-0c41caa532af (Addr: tcp/127.0.0.1:20542) (DC: dc1)
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.280112 [INFO] agent: Started DNS server 127.0.0.1:20537 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.281328 [INFO] serf: EventMemberJoin: Node eab7861f-25c9-623c-abef-cebb0f5f1108.dc1 127.0.0.1
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.282973 [INFO] agent: Started DNS server 127.0.0.1:20537 (udp)
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.285545 [INFO] agent: Started HTTP server on 127.0.0.1:20538 (tcp)
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.285657 [INFO] agent: started state syncer
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.287766 [INFO] serf: EventMemberJoin: Node eab7861f-25c9-623c-abef-cebb0f5f1108 127.0.0.1
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.288628 [INFO] consul: Adding LAN server Node eab7861f-25c9-623c-abef-cebb0f5f1108 (Addr: tcp/127.0.0.1:20536) (DC: dc1)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.289771 [INFO] consul: Handled member-join event for server "Node eab7861f-25c9-623c-abef-cebb0f5f1108.dc1" in area "wan"
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.290568 [INFO] agent: Started DNS server 127.0.0.1:20531 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.290641 [INFO] agent: Started DNS server 127.0.0.1:20531 (udp)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.293381 [INFO] agent: Started HTTP server on 127.0.0.1:20532 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.293628 [INFO] agent: started state syncer
2019/12/30 19:08:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20536 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20542 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f32f71b3-1df8-d58d-fca8-866b8b424f86 Address:127.0.0.1:20548}]
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20548 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Keys - 2019/12/30 19:08:23.591074 [INFO] serf: EventMemberJoin: Node f32f71b3-1df8-d58d-fca8-866b8b424f86.dc1 127.0.0.1
TestKVGetCommand_Keys - 2019/12/30 19:08:23.594516 [INFO] serf: EventMemberJoin: Node f32f71b3-1df8-d58d-fca8-866b8b424f86 127.0.0.1
TestKVGetCommand_Keys - 2019/12/30 19:08:23.595409 [INFO] consul: Handled member-join event for server "Node f32f71b3-1df8-d58d-fca8-866b8b424f86.dc1" in area "wan"
TestKVGetCommand_Keys - 2019/12/30 19:08:23.595723 [INFO] consul: Adding LAN server Node f32f71b3-1df8-d58d-fca8-866b8b424f86 (Addr: tcp/127.0.0.1:20548) (DC: dc1)
TestKVGetCommand_Keys - 2019/12/30 19:08:23.596487 [INFO] agent: Started DNS server 127.0.0.1:20543 (tcp)
TestKVGetCommand_Keys - 2019/12/30 19:08:23.596589 [INFO] agent: Started DNS server 127.0.0.1:20543 (udp)
TestKVGetCommand_Keys - 2019/12/30 19:08:23.599043 [INFO] agent: Started HTTP server on 127.0.0.1:20544 (tcp)
TestKVGetCommand_Keys - 2019/12/30 19:08:23.599167 [INFO] agent: started state syncer
2019/12/30 19:08:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20548 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20530 [Leader] entering Leader state
TestKVGetCommand - 2019/12/30 19:08:23.686071 [INFO] consul: cluster leadership acquired
TestKVGetCommand - 2019/12/30 19:08:23.686505 [INFO] consul: New leader elected: Node d3f948cb-8979-0d54-8803-701a7d641862
2019/12/30 19:08:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20536 [Leader] entering Leader state
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.834474 [INFO] consul: cluster leadership acquired
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:23.834904 [INFO] consul: New leader elected: Node eab7861f-25c9-623c-abef-cebb0f5f1108
2019/12/30 19:08:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:23 [INFO]  raft: Node at 127.0.0.1:20542 [Leader] entering Leader state
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.837959 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Recurse - 2019/12/30 19:08:23.838409 [INFO] consul: New leader elected: Node 48ced1d3-ece0-27ae-67d9-0c41caa532af
TestKVGetCommand - 2019/12/30 19:08:24.002073 [INFO] agent: Synced node info
TestKVGetCommand - 2019/12/30 19:08:24.002956 [DEBUG] http: Request PUT /v1/kv/foo (281.620643ms) from=127.0.0.1:34626
TestKVGetCommand - 2019/12/30 19:08:24.011323 [DEBUG] http: Request GET /v1/kv/foo (1.617711ms) from=127.0.0.1:34628
TestKVGetCommand - 2019/12/30 19:08:24.013871 [INFO] agent: Requesting shutdown
TestKVGetCommand - 2019/12/30 19:08:24.013981 [INFO] consul: shutting down server
TestKVGetCommand - 2019/12/30 19:08:24.014029 [WARN] serf: Shutdown without a Leave
2019/12/30 19:08:24 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:24 [INFO]  raft: Node at 127.0.0.1:20548 [Leader] entering Leader state
TestKVGetCommand - 2019/12/30 19:08:24.187037 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/12/30 19:08:24.189020 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Keys - 2019/12/30 19:08:24.189514 [INFO] consul: New leader elected: Node f32f71b3-1df8-d58d-fca8-866b8b424f86
TestKVGetCommand_Recurse - 2019/12/30 19:08:24.302961 [INFO] agent: Synced node info
TestKVGetCommand_Recurse - 2019/12/30 19:08:24.303091 [DEBUG] agent: Node info in sync
TestKVGetCommand - 2019/12/30 19:08:24.304491 [INFO] manager: shutting down
TestKVGetCommand_Recurse - 2019/12/30 19:08:24.470126 [DEBUG] http: Request PUT /v1/kv/foo/a (325.384164ms) from=127.0.0.1:35102
TestKVGetCommand - 2019/12/30 19:08:24.559800 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVGetCommand - 2019/12/30 19:08:24.559908 [INFO] agent: consul server down
TestKVGetCommand - 2019/12/30 19:08:24.559961 [INFO] agent: shutdown complete
TestKVGetCommand - 2019/12/30 19:08:24.559992 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVGetCommand - 2019/12/30 19:08:24.560016 [INFO] agent: Stopping DNS server 127.0.0.1:20525 (tcp)
TestKVGetCommand - 2019/12/30 19:08:24.560242 [INFO] agent: Stopping DNS server 127.0.0.1:20525 (udp)
TestKVGetCommand - 2019/12/30 19:08:24.560435 [INFO] agent: Stopping HTTP server 127.0.0.1:20526 (tcp)
TestKVGetCommand - 2019/12/30 19:08:24.561247 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand - 2019/12/30 19:08:24.561349 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand (2.51s)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:24.655298 [INFO] agent: Synced node info
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:24.655766 [DEBUG] http: Request PUT /v1/kv/foo/a (411.252826ms) from=127.0.0.1:56916
TestKVGetCommand_Keys - 2019/12/30 19:08:24.736566 [INFO] agent: Synced node info
TestKVGetCommand_Keys - 2019/12/30 19:08:24.737288 [DEBUG] http: Request PUT /v1/kv/foo/bar (469.377071ms) from=127.0.0.1:37272
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:25.620832 [DEBUG] http: Request PUT /v1/kv/foo/b (963.059467ms) from=127.0.0.1:56916
TestKVGetCommand_Recurse - 2019/12/30 19:08:25.702759 [DEBUG] http: Request PUT /v1/kv/foo/b (1.22935036s) from=127.0.0.1:35102
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.310780 [DEBUG] agent: Node info in sync
TestKVGetCommand_Keys - 2019/12/30 19:08:26.357169 [DEBUG] http: Request PUT /v1/kv/foo/baz (1.617379888s) from=127.0.0.1:37272
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.359079 [DEBUG] http: Request PUT /v1/kv/foo/c (735.783965ms) from=127.0.0.1:56916
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.381215 [DEBUG] http: Request PUT /v1/kv/foo/c (674.707975ms) from=127.0.0.1:35102
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.398064 [DEBUG] http: Request GET /v1/kv/foo?recurse= (2.150392ms) from=127.0.0.1:35108
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.406319 [DEBUG] http: Request GET /v1/kv/foo?recurse= (1.2247ms) from=127.0.0.1:56922
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.406716 [INFO] agent: Requesting shutdown
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.406807 [INFO] consul: shutting down server
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.406902 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.408705 [INFO] agent: Requesting shutdown
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.408863 [INFO] consul: shutting down server
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.408928 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.509697 [DEBUG] agent: Node info in sync
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.509814 [DEBUG] agent: Node info in sync
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.560081 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.561635 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.658486 [INFO] manager: shutting down
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.659046 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.661266 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.662014 [INFO] manager: shutting down
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.665931 [WARN] consul: error getting server health from "Node 48ced1d3-ece0-27ae-67d9-0c41caa532af": rpc error making call: EOF
TestKVGetCommand_Keys - 2019/12/30 19:08:26.677900 [DEBUG] agent: Node info in sync
TestKVGetCommand_Keys - 2019/12/30 19:08:26.678017 [DEBUG] agent: Node info in sync
TestKVGetCommand_Keys - 2019/12/30 19:08:26.777123 [DEBUG] http: Request PUT /v1/kv/foo/zip (372.063429ms) from=127.0.0.1:37272
TestKVGetCommand_Keys - 2019/12/30 19:08:26.781774 [DEBUG] http: Request GET /v1/kv/foo/?keys=&separator=%2F (1.562709ms) from=127.0.0.1:37278
TestKVGetCommand_Keys - 2019/12/30 19:08:26.783244 [INFO] agent: Requesting shutdown
TestKVGetCommand_Keys - 2019/12/30 19:08:26.783336 [INFO] consul: shutting down server
TestKVGetCommand_Keys - 2019/12/30 19:08:26.783383 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/12/30 19:08:26.860324 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/12/30 19:08:26.942541 [INFO] manager: shutting down
TestKVGetCommand_Keys - 2019/12/30 19:08:26.942587 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.942957 [INFO] agent: consul server down
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.943008 [INFO] agent: shutdown complete
TestKVGetCommand_Keys - 2019/12/30 19:08:26.942964 [INFO] agent: consul server down
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.943058 [INFO] agent: Stopping DNS server 127.0.0.1:20537 (tcp)
TestKVGetCommand_Keys - 2019/12/30 19:08:26.943082 [INFO] agent: shutdown complete
TestKVGetCommand_Keys - 2019/12/30 19:08:26.943134 [INFO] agent: Stopping DNS server 127.0.0.1:20543 (tcp)
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.943180 [INFO] agent: Stopping DNS server 127.0.0.1:20537 (udp)
TestKVGetCommand_Keys - 2019/12/30 19:08:26.943247 [INFO] agent: Stopping DNS server 127.0.0.1:20543 (udp)
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.943321 [INFO] agent: Stopping HTTP server 127.0.0.1:20538 (tcp)
TestKVGetCommand_Keys - 2019/12/30 19:08:26.943381 [INFO] agent: Stopping HTTP server 127.0.0.1:20544 (tcp)
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.943933 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Keys - 2019/12/30 19:08:26.943932 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.944001 [ERR] connect: Apply failed leadership lost while committing log
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.944054 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVGetCommand_Recurse - 2019/12/30 19:08:26.944070 [INFO] agent: Endpoints down
TestKVGetCommand_Keys - 2019/12/30 19:08:26.944159 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Recurse (4.73s)
--- PASS: TestKVGetCommand_Keys (4.32s)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.945524 [INFO] agent: consul server down
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.945592 [INFO] agent: shutdown complete
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.945646 [INFO] agent: Stopping DNS server 127.0.0.1:20531 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.945776 [INFO] agent: Stopping DNS server 127.0.0.1:20531 (udp)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.945918 [INFO] agent: Stopping HTTP server 127.0.0.1:20532 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.946300 [ERR] connect: Apply failed leadership lost while committing log
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.946354 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.946511 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_RecurseBase64 - 2019/12/30 19:08:26.946555 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_RecurseBase64 (4.73s)
PASS
ok  	github.com/hashicorp/consul/command/kv/get	8.909s
=== RUN   TestKVImportCommand_noTabs
=== PAUSE TestKVImportCommand_noTabs
=== RUN   TestKVImportCommand
=== PAUSE TestKVImportCommand
=== CONT  TestKVImportCommand_noTabs
=== CONT  TestKVImportCommand
--- PASS: TestKVImportCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVImportCommand - 2019/12/30 19:08:28.665334 [WARN] agent: Node name "Node 4a8da2ce-8fbe-81b8-23d1-854e92d02c73" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVImportCommand - 2019/12/30 19:08:28.667225 [DEBUG] tlsutil: Update with version 1
TestKVImportCommand - 2019/12/30 19:08:28.675745 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:08:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4a8da2ce-8fbe-81b8-23d1-854e92d02c73 Address:127.0.0.1:23506}]
2019/12/30 19:08:30 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestKVImportCommand - 2019/12/30 19:08:31.004028 [INFO] serf: EventMemberJoin: Node 4a8da2ce-8fbe-81b8-23d1-854e92d02c73.dc1 127.0.0.1
TestKVImportCommand - 2019/12/30 19:08:31.007831 [INFO] serf: EventMemberJoin: Node 4a8da2ce-8fbe-81b8-23d1-854e92d02c73 127.0.0.1
TestKVImportCommand - 2019/12/30 19:08:31.011400 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestKVImportCommand - 2019/12/30 19:08:31.011644 [INFO] consul: Adding LAN server Node 4a8da2ce-8fbe-81b8-23d1-854e92d02c73 (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestKVImportCommand - 2019/12/30 19:08:31.011749 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestKVImportCommand - 2019/12/30 19:08:31.013132 [INFO] consul: Handled member-join event for server "Node 4a8da2ce-8fbe-81b8-23d1-854e92d02c73.dc1" in area "wan"
TestKVImportCommand - 2019/12/30 19:08:31.014729 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestKVImportCommand - 2019/12/30 19:08:31.015162 [INFO] agent: started state syncer
2019/12/30 19:08:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:31 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:31 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestKVImportCommand - 2019/12/30 19:08:31.559938 [INFO] consul: cluster leadership acquired
TestKVImportCommand - 2019/12/30 19:08:31.560552 [INFO] consul: New leader elected: Node 4a8da2ce-8fbe-81b8-23d1-854e92d02c73
TestKVImportCommand - 2019/12/30 19:08:31.860368 [INFO] agent: Synced node info
TestKVImportCommand - 2019/12/30 19:08:31.860506 [DEBUG] agent: Node info in sync
TestKVImportCommand - 2019/12/30 19:08:32.035340 [DEBUG] http: Request PUT /v1/kv/foo (319.954346ms) from=127.0.0.1:38132
TestKVImportCommand - 2019/12/30 19:08:32.224939 [DEBUG] agent: Node info in sync
TestKVImportCommand - 2019/12/30 19:08:32.528359 [DEBUG] http: Request PUT /v1/kv/foo/a (487.785564ms) from=127.0.0.1:38132
TestKVImportCommand - 2019/12/30 19:08:32.533440 [DEBUG] http: Request GET /v1/kv/foo (1.767381ms) from=127.0.0.1:38134
TestKVImportCommand - 2019/12/30 19:08:32.537038 [DEBUG] http: Request GET /v1/kv/foo/a (1.180032ms) from=127.0.0.1:38134
TestKVImportCommand - 2019/12/30 19:08:32.539122 [INFO] agent: Requesting shutdown
TestKVImportCommand - 2019/12/30 19:08:32.539220 [INFO] consul: shutting down server
TestKVImportCommand - 2019/12/30 19:08:32.539271 [WARN] serf: Shutdown without a Leave
TestKVImportCommand - 2019/12/30 19:08:32.780190 [WARN] serf: Shutdown without a Leave
TestKVImportCommand - 2019/12/30 19:08:33.592676 [INFO] manager: shutting down
TestKVImportCommand - 2019/12/30 19:08:33.684332 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestKVImportCommand - 2019/12/30 19:08:33.684585 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVImportCommand - 2019/12/30 19:08:33.684651 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVImportCommand - 2019/12/30 19:08:33.685182 [INFO] agent: consul server down
TestKVImportCommand - 2019/12/30 19:08:33.685314 [INFO] agent: shutdown complete
TestKVImportCommand - 2019/12/30 19:08:33.685529 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestKVImportCommand - 2019/12/30 19:08:33.685676 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestKVImportCommand - 2019/12/30 19:08:33.685827 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestKVImportCommand - 2019/12/30 19:08:33.686433 [INFO] agent: Waiting for endpoints to shut down
TestKVImportCommand - 2019/12/30 19:08:33.686513 [INFO] agent: Endpoints down
--- PASS: TestKVImportCommand (5.13s)
PASS
ok  	github.com/hashicorp/consul/command/kv/imp	5.663s
?   	github.com/hashicorp/consul/command/kv/impexp	[no test files]
=== RUN   TestKVPutCommand_noTabs
=== PAUSE TestKVPutCommand_noTabs
=== RUN   TestKVPutCommand_Validation
=== PAUSE TestKVPutCommand_Validation
=== RUN   TestKVPutCommand
=== PAUSE TestKVPutCommand
=== RUN   TestKVPutCommand_EmptyDataQuoted
--- SKIP: TestKVPutCommand_EmptyDataQuoted (0.00s)
    kv_put_test.go:108: DM-skipped
=== RUN   TestKVPutCommand_Base64
=== PAUSE TestKVPutCommand_Base64
=== RUN   TestKVPutCommand_File
=== PAUSE TestKVPutCommand_File
=== RUN   TestKVPutCommand_FileNoExist
=== PAUSE TestKVPutCommand_FileNoExist
=== RUN   TestKVPutCommand_Stdin
=== PAUSE TestKVPutCommand_Stdin
=== RUN   TestKVPutCommand_NegativeVal
=== PAUSE TestKVPutCommand_NegativeVal
=== RUN   TestKVPutCommand_Flags
=== PAUSE TestKVPutCommand_Flags
=== RUN   TestKVPutCommand_CAS
=== PAUSE TestKVPutCommand_CAS
=== CONT  TestKVPutCommand_noTabs
--- PASS: TestKVPutCommand_noTabs (0.00s)
=== CONT  TestKVPutCommand_CAS
=== CONT  TestKVPutCommand_NegativeVal
=== CONT  TestKVPutCommand_FileNoExist
--- PASS: TestKVPutCommand_FileNoExist (0.00s)
=== CONT  TestKVPutCommand_Stdin
=== CONT  TestKVPutCommand_Flags
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_CAS - 2019/12/30 19:08:37.891314 [WARN] agent: Node name "Node f2fdbf92-18ca-e876-11a4-f7bd6850b082" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_CAS - 2019/12/30 19:08:37.892118 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_CAS - 2019/12/30 19:08:37.898429 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Stdin - 2019/12/30 19:08:37.948085 [WARN] agent: Node name "Node 4e5b46f0-821a-5579-fcee-3061f576e176" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Flags - 2019/12/30 19:08:37.948293 [WARN] agent: Node name "Node a15b60db-77f4-0c45-5fae-de46ba027ae5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Stdin - 2019/12/30 19:08:37.948909 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_Flags - 2019/12/30 19:08:37.949084 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:37.950122 [WARN] agent: Node name "Node e1bd0919-99c4-ee0c-fcc7-32ae05d7248a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:37.950610 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_Flags - 2019/12/30 19:08:37.953285 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:37.954015 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_Stdin - 2019/12/30 19:08:37.958877 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:08:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f2fdbf92-18ca-e876-11a4-f7bd6850b082 Address:127.0.0.1:26524}]
2019/12/30 19:08:38 [INFO]  raft: Node at 127.0.0.1:26524 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_CAS - 2019/12/30 19:08:38.848810 [INFO] serf: EventMemberJoin: Node f2fdbf92-18ca-e876-11a4-f7bd6850b082.dc1 127.0.0.1
TestKVPutCommand_CAS - 2019/12/30 19:08:38.857361 [INFO] serf: EventMemberJoin: Node f2fdbf92-18ca-e876-11a4-f7bd6850b082 127.0.0.1
TestKVPutCommand_CAS - 2019/12/30 19:08:38.858744 [INFO] consul: Adding LAN server Node f2fdbf92-18ca-e876-11a4-f7bd6850b082 (Addr: tcp/127.0.0.1:26524) (DC: dc1)
TestKVPutCommand_CAS - 2019/12/30 19:08:38.860294 [INFO] consul: Handled member-join event for server "Node f2fdbf92-18ca-e876-11a4-f7bd6850b082.dc1" in area "wan"
TestKVPutCommand_CAS - 2019/12/30 19:08:38.865060 [INFO] agent: Started DNS server 127.0.0.1:26519 (udp)
TestKVPutCommand_CAS - 2019/12/30 19:08:38.865385 [INFO] agent: Started DNS server 127.0.0.1:26519 (tcp)
TestKVPutCommand_CAS - 2019/12/30 19:08:38.869487 [INFO] agent: Started HTTP server on 127.0.0.1:26520 (tcp)
TestKVPutCommand_CAS - 2019/12/30 19:08:38.869647 [INFO] agent: started state syncer
2019/12/30 19:08:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:38 [INFO]  raft: Node at 127.0.0.1:26524 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e1bd0919-99c4-ee0c-fcc7-32ae05d7248a Address:127.0.0.1:26506}]
2019/12/30 19:08:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4e5b46f0-821a-5579-fcee-3061f576e176 Address:127.0.0.1:26512}]
TestKVPutCommand_Stdin - 2019/12/30 19:08:38.951627 [INFO] serf: EventMemberJoin: Node 4e5b46f0-821a-5579-fcee-3061f576e176.dc1 127.0.0.1
2019/12/30 19:08:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a15b60db-77f4-0c45-5fae-de46ba027ae5 Address:127.0.0.1:26518}]
TestKVPutCommand_Stdin - 2019/12/30 19:08:38.956909 [INFO] serf: EventMemberJoin: Node 4e5b46f0-821a-5579-fcee-3061f576e176 127.0.0.1
2019/12/30 19:08:38 [INFO]  raft: Node at 127.0.0.1:26512 [Follower] entering Follower state (Leader: "")
2019/12/30 19:08:38 [INFO]  raft: Node at 127.0.0.1:26506 [Follower] entering Follower state (Leader: "")
2019/12/30 19:08:38 [INFO]  raft: Node at 127.0.0.1:26518 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:38.958725 [INFO] serf: EventMemberJoin: Node e1bd0919-99c4-ee0c-fcc7-32ae05d7248a.dc1 127.0.0.1
TestKVPutCommand_Stdin - 2019/12/30 19:08:38.958888 [INFO] consul: Handled member-join event for server "Node 4e5b46f0-821a-5579-fcee-3061f576e176.dc1" in area "wan"
TestKVPutCommand_Stdin - 2019/12/30 19:08:38.969924 [INFO] consul: Adding LAN server Node 4e5b46f0-821a-5579-fcee-3061f576e176 (Addr: tcp/127.0.0.1:26512) (DC: dc1)
TestKVPutCommand_Flags - 2019/12/30 19:08:39.001536 [INFO] serf: EventMemberJoin: Node a15b60db-77f4-0c45-5fae-de46ba027ae5.dc1 127.0.0.1
2019/12/30 19:08:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:39 [INFO]  raft: Node at 127.0.0.1:26518 [Candidate] entering Candidate state in term 2
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.007212 [INFO] serf: EventMemberJoin: Node e1bd0919-99c4-ee0c-fcc7-32ae05d7248a 127.0.0.1
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.008520 [INFO] agent: Started DNS server 127.0.0.1:26501 (udp)
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.010399 [INFO] agent: Started DNS server 127.0.0.1:26507 (udp)
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.010768 [INFO] agent: Started DNS server 127.0.0.1:26507 (tcp)
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.012960 [INFO] agent: Started HTTP server on 127.0.0.1:26508 (tcp)
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.013080 [INFO] agent: started state syncer
2019/12/30 19:08:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:39 [INFO]  raft: Node at 127.0.0.1:26506 [Candidate] entering Candidate state in term 2
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.013943 [INFO] consul: Handled member-join event for server "Node e1bd0919-99c4-ee0c-fcc7-32ae05d7248a.dc1" in area "wan"
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.014594 [INFO] agent: Started DNS server 127.0.0.1:26501 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.016852 [INFO] agent: Started HTTP server on 127.0.0.1:26502 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.017074 [INFO] agent: started state syncer
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.017573 [INFO] consul: Adding LAN server Node e1bd0919-99c4-ee0c-fcc7-32ae05d7248a (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestKVPutCommand_Flags - 2019/12/30 19:08:39.019274 [INFO] serf: EventMemberJoin: Node a15b60db-77f4-0c45-5fae-de46ba027ae5 127.0.0.1
TestKVPutCommand_Flags - 2019/12/30 19:08:39.020587 [INFO] consul: Adding LAN server Node a15b60db-77f4-0c45-5fae-de46ba027ae5 (Addr: tcp/127.0.0.1:26518) (DC: dc1)
TestKVPutCommand_Flags - 2019/12/30 19:08:39.020997 [INFO] consul: Handled member-join event for server "Node a15b60db-77f4-0c45-5fae-de46ba027ae5.dc1" in area "wan"
TestKVPutCommand_Flags - 2019/12/30 19:08:39.024337 [INFO] agent: Started DNS server 127.0.0.1:26513 (udp)
TestKVPutCommand_Flags - 2019/12/30 19:08:39.024534 [INFO] agent: Started DNS server 127.0.0.1:26513 (tcp)
2019/12/30 19:08:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:39 [INFO]  raft: Node at 127.0.0.1:26512 [Candidate] entering Candidate state in term 2
TestKVPutCommand_Flags - 2019/12/30 19:08:39.047365 [INFO] agent: Started HTTP server on 127.0.0.1:26514 (tcp)
TestKVPutCommand_Flags - 2019/12/30 19:08:39.047623 [INFO] agent: started state syncer
2019/12/30 19:08:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:39 [INFO]  raft: Node at 127.0.0.1:26524 [Leader] entering Leader state
TestKVPutCommand_CAS - 2019/12/30 19:08:39.429310 [INFO] consul: cluster leadership acquired
TestKVPutCommand_CAS - 2019/12/30 19:08:39.429878 [INFO] consul: New leader elected: Node f2fdbf92-18ca-e876-11a4-f7bd6850b082
2019/12/30 19:08:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:39 [INFO]  raft: Node at 127.0.0.1:26518 [Leader] entering Leader state
2019/12/30 19:08:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:39 [INFO]  raft: Node at 127.0.0.1:26506 [Leader] entering Leader state
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.594226 [INFO] consul: cluster leadership acquired
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.594738 [INFO] consul: New leader elected: Node e1bd0919-99c4-ee0c-fcc7-32ae05d7248a
TestKVPutCommand_Flags - 2019/12/30 19:08:39.595005 [INFO] consul: cluster leadership acquired
2019/12/30 19:08:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:39 [INFO]  raft: Node at 127.0.0.1:26512 [Leader] entering Leader state
TestKVPutCommand_Flags - 2019/12/30 19:08:39.595319 [INFO] consul: New leader elected: Node a15b60db-77f4-0c45-5fae-de46ba027ae5
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.595539 [INFO] consul: cluster leadership acquired
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.595855 [INFO] consul: New leader elected: Node 4e5b46f0-821a-5579-fcee-3061f576e176
TestKVPutCommand_CAS - 2019/12/30 19:08:39.780540 [INFO] agent: Synced node info
TestKVPutCommand_CAS - 2019/12/30 19:08:39.780659 [DEBUG] agent: Node info in sync
TestKVPutCommand_CAS - 2019/12/30 19:08:39.781834 [DEBUG] http: Request PUT /v1/kv/foo (291.656574ms) from=127.0.0.1:45910
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.935611 [INFO] agent: Synced node info
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.935753 [DEBUG] agent: Node info in sync
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.936299 [DEBUG] http: Request PUT /v1/kv/foo (204.56188ms) from=127.0.0.1:59810
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.943072 [INFO] agent: Synced node info
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.943208 [DEBUG] agent: Node info in sync
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.948199 [DEBUG] http: Request PUT /v1/kv/foo (345.915712ms) from=127.0.0.1:34136
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.969450 [DEBUG] http: Request GET /v1/kv/foo (5.76549ms) from=127.0.0.1:34146
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.981945 [INFO] agent: Requesting shutdown
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.982035 [INFO] consul: shutting down server
TestKVPutCommand_Stdin - 2019/12/30 19:08:39.982085 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.996916 [DEBUG] http: Request GET /v1/kv/foo (54.57348ms) from=127.0.0.1:59814
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.998861 [INFO] agent: Requesting shutdown
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.998970 [INFO] consul: shutting down server
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:39.999021 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.084492 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Flags - 2019/12/30 19:08:40.086094 [INFO] agent: Synced node info
TestKVPutCommand_CAS - 2019/12/30 19:08:40.088441 [DEBUG] agent: Node info in sync
TestKVPutCommand_Flags - 2019/12/30 19:08:40.088918 [DEBUG] http: Request PUT /v1/kv/foo?flags=12345 (404.104623ms) from=127.0.0.1:38824
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.090816 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Flags - 2019/12/30 19:08:40.093632 [DEBUG] http: Request GET /v1/kv/foo (1.096363ms) from=127.0.0.1:38834
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.095459 [DEBUG] agent: Node info in sync
TestKVPutCommand_Flags - 2019/12/30 19:08:40.099018 [INFO] agent: Requesting shutdown
TestKVPutCommand_Flags - 2019/12/30 19:08:40.099136 [INFO] consul: shutting down server
TestKVPutCommand_Flags - 2019/12/30 19:08:40.099185 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Flags - 2019/12/30 19:08:40.167729 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.167840 [INFO] manager: shutting down
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.168567 [INFO] agent: consul server down
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.168631 [INFO] agent: shutdown complete
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.168693 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (tcp)
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.168720 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.168856 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (udp)
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.169011 [INFO] agent: Stopping HTTP server 127.0.0.1:26508 (tcp)
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.169041 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.169705 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Stdin - 2019/12/30 19:08:40.169801 [INFO] agent: Endpoints down
=== CONT  TestKVPutCommand_Base64
--- PASS: TestKVPutCommand_Stdin (2.41s)
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.174689 [INFO] manager: shutting down
TestKVPutCommand_Flags - 2019/12/30 19:08:40.251827 [INFO] manager: shutting down
TestKVPutCommand_CAS - 2019/12/30 19:08:40.253870 [DEBUG] http: Request PUT /v1/kv/foo?cas=123 (465.657625ms) from=127.0.0.1:45918
TestKVPutCommand_Flags - 2019/12/30 19:08:40.254354 [INFO] agent: consul server down
TestKVPutCommand_Flags - 2019/12/30 19:08:40.254517 [INFO] agent: shutdown complete
TestKVPutCommand_Flags - 2019/12/30 19:08:40.254607 [INFO] agent: Stopping DNS server 127.0.0.1:26513 (tcp)
TestKVPutCommand_Flags - 2019/12/30 19:08:40.254772 [INFO] agent: Stopping DNS server 127.0.0.1:26513 (udp)
TestKVPutCommand_Flags - 2019/12/30 19:08:40.254894 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVPutCommand_Flags - 2019/12/30 19:08:40.254957 [INFO] agent: Stopping HTTP server 127.0.0.1:26514 (tcp)
TestKVPutCommand_Flags - 2019/12/30 19:08:40.255208 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand_Flags - 2019/12/30 19:08:40.255700 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Flags - 2019/12/30 19:08:40.255787 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_Flags (2.49s)
=== CONT  TestKVPutCommand_File
TestKVPutCommand_CAS - 2019/12/30 19:08:40.262376 [DEBUG] http: Request GET /v1/kv/foo (668.351µs) from=127.0.0.1:45910
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Base64 - 2019/12/30 19:08:40.288234 [WARN] agent: Node name "Node 23d0ec79-b386-51e2-fb9b-9e02eb655049" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Base64 - 2019/12/30 19:08:40.288771 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_Base64 - 2019/12/30 19:08:40.300660 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_File - 2019/12/30 19:08:40.396400 [WARN] agent: Node name "Node e3eadc5a-58f7-5719-b608-5c3db8bfd0f9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_File - 2019/12/30 19:08:40.396792 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_File - 2019/12/30 19:08:40.398921 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.444757 [INFO] agent: consul server down
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.444840 [INFO] agent: shutdown complete
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.444904 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.445057 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (udp)
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.445224 [INFO] agent: Stopping HTTP server 127.0.0.1:26502 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.445693 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.445943 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.445963 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.446051 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVPutCommand_NegativeVal - 2019/12/30 19:08:40.446151 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_NegativeVal (2.70s)
=== CONT  TestKVPutCommand
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand - 2019/12/30 19:08:40.557884 [WARN] agent: Node name "Node 1242429c-38c2-a508-6e93-16765730999e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand - 2019/12/30 19:08:40.569847 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand - 2019/12/30 19:08:40.572351 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_CAS - 2019/12/30 19:08:40.779954 [DEBUG] http: Request PUT /v1/kv/foo?cas=4 (513.084577ms) from=127.0.0.1:45926
TestKVPutCommand_CAS - 2019/12/30 19:08:40.784053 [DEBUG] http: Request GET /v1/kv/foo (810.689µs) from=127.0.0.1:45910
TestKVPutCommand_CAS - 2019/12/30 19:08:40.785538 [INFO] agent: Requesting shutdown
TestKVPutCommand_CAS - 2019/12/30 19:08:40.785643 [INFO] consul: shutting down server
TestKVPutCommand_CAS - 2019/12/30 19:08:40.785691 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_CAS - 2019/12/30 19:08:40.850995 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_CAS - 2019/12/30 19:08:40.960966 [INFO] manager: shutting down
TestKVPutCommand_CAS - 2019/12/30 19:08:40.962220 [INFO] agent: consul server down
TestKVPutCommand_CAS - 2019/12/30 19:08:40.962286 [INFO] agent: shutdown complete
TestKVPutCommand_CAS - 2019/12/30 19:08:40.962340 [INFO] agent: Stopping DNS server 127.0.0.1:26519 (tcp)
TestKVPutCommand_CAS - 2019/12/30 19:08:40.962467 [INFO] agent: Stopping DNS server 127.0.0.1:26519 (udp)
TestKVPutCommand_CAS - 2019/12/30 19:08:40.962633 [INFO] agent: Stopping HTTP server 127.0.0.1:26520 (tcp)
TestKVPutCommand_CAS - 2019/12/30 19:08:40.963466 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_CAS - 2019/12/30 19:08:40.963609 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_CAS (3.21s)
=== CONT  TestKVPutCommand_Validation
TestKVPutCommand_CAS - 2019/12/30 19:08:40.983258 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
--- PASS: TestKVPutCommand_Validation (0.02s)
2019/12/30 19:08:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:23d0ec79-b386-51e2-fb9b-9e02eb655049 Address:127.0.0.1:26530}]
2019/12/30 19:08:41 [INFO]  raft: Node at 127.0.0.1:26530 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.305386 [INFO] serf: EventMemberJoin: Node 23d0ec79-b386-51e2-fb9b-9e02eb655049.dc1 127.0.0.1
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.308832 [INFO] serf: EventMemberJoin: Node 23d0ec79-b386-51e2-fb9b-9e02eb655049 127.0.0.1
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.309523 [INFO] consul: Handled member-join event for server "Node 23d0ec79-b386-51e2-fb9b-9e02eb655049.dc1" in area "wan"
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.309859 [INFO] consul: Adding LAN server Node 23d0ec79-b386-51e2-fb9b-9e02eb655049 (Addr: tcp/127.0.0.1:26530) (DC: dc1)
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.310215 [INFO] agent: Started DNS server 127.0.0.1:26525 (udp)
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.310288 [INFO] agent: Started DNS server 127.0.0.1:26525 (tcp)
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.312903 [INFO] agent: Started HTTP server on 127.0.0.1:26526 (tcp)
TestKVPutCommand_Base64 - 2019/12/30 19:08:41.313008 [INFO] agent: started state syncer
2019/12/30 19:08:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:41 [INFO]  raft: Node at 127.0.0.1:26530 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e3eadc5a-58f7-5719-b608-5c3db8bfd0f9 Address:127.0.0.1:26536}]
2019/12/30 19:08:41 [INFO]  raft: Node at 127.0.0.1:26536 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_File - 2019/12/30 19:08:41.397203 [INFO] serf: EventMemberJoin: Node e3eadc5a-58f7-5719-b608-5c3db8bfd0f9.dc1 127.0.0.1
TestKVPutCommand_File - 2019/12/30 19:08:41.400688 [INFO] serf: EventMemberJoin: Node e3eadc5a-58f7-5719-b608-5c3db8bfd0f9 127.0.0.1
TestKVPutCommand_File - 2019/12/30 19:08:41.401902 [INFO] consul: Adding LAN server Node e3eadc5a-58f7-5719-b608-5c3db8bfd0f9 (Addr: tcp/127.0.0.1:26536) (DC: dc1)
TestKVPutCommand_File - 2019/12/30 19:08:41.402030 [INFO] agent: Started DNS server 127.0.0.1:26531 (udp)
TestKVPutCommand_File - 2019/12/30 19:08:41.402201 [INFO] consul: Handled member-join event for server "Node e3eadc5a-58f7-5719-b608-5c3db8bfd0f9.dc1" in area "wan"
TestKVPutCommand_File - 2019/12/30 19:08:41.402391 [INFO] agent: Started DNS server 127.0.0.1:26531 (tcp)
TestKVPutCommand_File - 2019/12/30 19:08:41.405143 [INFO] agent: Started HTTP server on 127.0.0.1:26532 (tcp)
TestKVPutCommand_File - 2019/12/30 19:08:41.405253 [INFO] agent: started state syncer
2019/12/30 19:08:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:41 [INFO]  raft: Node at 127.0.0.1:26536 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1242429c-38c2-a508-6e93-16765730999e Address:127.0.0.1:26542}]
2019/12/30 19:08:41 [INFO]  raft: Node at 127.0.0.1:26542 [Follower] entering Follower state (Leader: "")
TestKVPutCommand - 2019/12/30 19:08:41.692278 [INFO] serf: EventMemberJoin: Node 1242429c-38c2-a508-6e93-16765730999e.dc1 127.0.0.1
TestKVPutCommand - 2019/12/30 19:08:41.701707 [INFO] serf: EventMemberJoin: Node 1242429c-38c2-a508-6e93-16765730999e 127.0.0.1
TestKVPutCommand - 2019/12/30 19:08:41.703608 [INFO] consul: Handled member-join event for server "Node 1242429c-38c2-a508-6e93-16765730999e.dc1" in area "wan"
TestKVPutCommand - 2019/12/30 19:08:41.703721 [INFO] consul: Adding LAN server Node 1242429c-38c2-a508-6e93-16765730999e (Addr: tcp/127.0.0.1:26542) (DC: dc1)
TestKVPutCommand - 2019/12/30 19:08:41.706698 [INFO] agent: Started DNS server 127.0.0.1:26537 (tcp)
TestKVPutCommand - 2019/12/30 19:08:41.707124 [INFO] agent: Started DNS server 127.0.0.1:26537 (udp)
TestKVPutCommand - 2019/12/30 19:08:41.709615 [INFO] agent: Started HTTP server on 127.0.0.1:26538 (tcp)
TestKVPutCommand - 2019/12/30 19:08:41.709724 [INFO] agent: started state syncer
2019/12/30 19:08:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:41 [INFO]  raft: Node at 127.0.0.1:26542 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:26536 [Leader] entering Leader state
2019/12/30 19:08:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:26530 [Leader] entering Leader state
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.019457 [INFO] consul: cluster leadership acquired
TestKVPutCommand_File - 2019/12/30 19:08:42.019579 [INFO] consul: cluster leadership acquired
TestKVPutCommand_File - 2019/12/30 19:08:42.019904 [INFO] consul: New leader elected: Node e3eadc5a-58f7-5719-b608-5c3db8bfd0f9
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.019904 [INFO] consul: New leader elected: Node 23d0ec79-b386-51e2-fb9b-9e02eb655049
2019/12/30 19:08:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:26542 [Leader] entering Leader state
TestKVPutCommand - 2019/12/30 19:08:42.263050 [INFO] consul: cluster leadership acquired
TestKVPutCommand - 2019/12/30 19:08:42.263558 [INFO] consul: New leader elected: Node 1242429c-38c2-a508-6e93-16765730999e
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.369949 [INFO] agent: Synced node info
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.370077 [DEBUG] agent: Node info in sync
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.371834 [DEBUG] http: Request PUT /v1/kv/foo (319.364657ms) from=127.0.0.1:35524
TestKVPutCommand_File - 2019/12/30 19:08:42.374806 [DEBUG] http: Request PUT /v1/kv/foo (235.786392ms) from=127.0.0.1:40358
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.377038 [DEBUG] http: Request GET /v1/kv/foo (1.134698ms) from=127.0.0.1:35528
TestKVPutCommand_File - 2019/12/30 19:08:42.378273 [INFO] agent: Synced node info
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.379277 [INFO] agent: Requesting shutdown
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.379430 [INFO] consul: shutting down server
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.379496 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_File - 2019/12/30 19:08:42.382159 [DEBUG] http: Request GET /v1/kv/foo (885.024µs) from=127.0.0.1:40362
TestKVPutCommand_File - 2019/12/30 19:08:42.384045 [INFO] agent: Requesting shutdown
TestKVPutCommand_File - 2019/12/30 19:08:42.384141 [INFO] consul: shutting down server
TestKVPutCommand_File - 2019/12/30 19:08:42.384188 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_File - 2019/12/30 19:08:42.459549 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.460278 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_File - 2019/12/30 19:08:42.567891 [INFO] manager: shutting down
TestKVPutCommand_File - 2019/12/30 19:08:42.567919 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestKVPutCommand_File - 2019/12/30 19:08:42.568317 [INFO] agent: consul server down
TestKVPutCommand_File - 2019/12/30 19:08:42.568370 [INFO] agent: shutdown complete
TestKVPutCommand_File - 2019/12/30 19:08:42.568427 [INFO] agent: Stopping DNS server 127.0.0.1:26531 (tcp)
TestKVPutCommand_File - 2019/12/30 19:08:42.568581 [INFO] agent: Stopping DNS server 127.0.0.1:26531 (udp)
TestKVPutCommand_File - 2019/12/30 19:08:42.568841 [INFO] agent: Stopping HTTP server 127.0.0.1:26532 (tcp)
TestKVPutCommand_File - 2019/12/30 19:08:42.568319 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand_File - 2019/12/30 19:08:42.569908 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_File - 2019/12/30 19:08:42.569987 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_File (2.31s)
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.571419 [INFO] manager: shutting down
TestKVPutCommand - 2019/12/30 19:08:42.661837 [DEBUG] http: Request PUT /v1/kv/foo (243.4646ms) from=127.0.0.1:48884
TestKVPutCommand - 2019/12/30 19:08:42.662315 [INFO] agent: Synced node info
TestKVPutCommand - 2019/12/30 19:08:42.668711 [DEBUG] http: Request GET /v1/kv/foo (948.026µs) from=127.0.0.1:48886
TestKVPutCommand - 2019/12/30 19:08:42.671028 [INFO] agent: Requesting shutdown
TestKVPutCommand - 2019/12/30 19:08:42.671119 [INFO] consul: shutting down server
TestKVPutCommand - 2019/12/30 19:08:42.671164 [WARN] serf: Shutdown without a Leave
TestKVPutCommand - 2019/12/30 19:08:42.768811 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.852251 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.852375 [INFO] agent: consul server down
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.852425 [INFO] agent: shutdown complete
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.852484 [INFO] agent: Stopping DNS server 127.0.0.1:26525 (tcp)
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.852690 [INFO] agent: Stopping DNS server 127.0.0.1:26525 (udp)
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.852889 [INFO] agent: Stopping HTTP server 127.0.0.1:26526 (tcp)
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.853638 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Base64 - 2019/12/30 19:08:42.853733 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_Base64 (2.68s)
TestKVPutCommand - 2019/12/30 19:08:42.854657 [INFO] manager: shutting down
TestKVPutCommand - 2019/12/30 19:08:43.092993 [INFO] agent: consul server down
TestKVPutCommand - 2019/12/30 19:08:43.093075 [INFO] agent: shutdown complete
TestKVPutCommand - 2019/12/30 19:08:43.093139 [INFO] agent: Stopping DNS server 127.0.0.1:26537 (tcp)
TestKVPutCommand - 2019/12/30 19:08:43.093271 [INFO] agent: Stopping DNS server 127.0.0.1:26537 (udp)
TestKVPutCommand - 2019/12/30 19:08:43.093435 [INFO] agent: Stopping HTTP server 127.0.0.1:26538 (tcp)
TestKVPutCommand - 2019/12/30 19:08:43.094025 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand - 2019/12/30 19:08:43.094137 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand - 2019/12/30 19:08:43.094172 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand (2.65s)
PASS
ok  	github.com/hashicorp/consul/command/kv/put	5.622s
=== RUN   TestLeaveCommand_noTabs
=== PAUSE TestLeaveCommand_noTabs
=== RUN   TestLeaveCommand
=== PAUSE TestLeaveCommand
=== RUN   TestLeaveCommand_FailOnNonFlagArgs
=== PAUSE TestLeaveCommand_FailOnNonFlagArgs
=== CONT  TestLeaveCommand_noTabs
=== CONT  TestLeaveCommand_FailOnNonFlagArgs
--- PASS: TestLeaveCommand_noTabs (0.00s)
=== CONT  TestLeaveCommand
WARNING: bootstrap = true: do not enable unless necessary
TestLeaveCommand - 2019/12/30 19:08:41.185137 [WARN] agent: Node name "Node 2a3f6b00-ba38-2db8-0870-7525976433f2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLeaveCommand - 2019/12/30 19:08:41.186046 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:41.200808 [WARN] agent: Node name "Node 56c96894-d5d5-d072-f807-95d0b1677813" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:41.204582 [DEBUG] tlsutil: Update with version 1
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:41.211118 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLeaveCommand - 2019/12/30 19:08:41.211875 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:08:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:56c96894-d5d5-d072-f807-95d0b1677813 Address:127.0.0.1:17506}]
2019/12/30 19:08:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2a3f6b00-ba38-2db8-0870-7525976433f2 Address:127.0.0.1:17512}]
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:17512 [Follower] entering Follower state (Leader: "")
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.265046 [INFO] serf: EventMemberJoin: Node 56c96894-d5d5-d072-f807-95d0b1677813.dc1 127.0.0.1
TestLeaveCommand - 2019/12/30 19:08:42.266060 [INFO] serf: EventMemberJoin: Node 2a3f6b00-ba38-2db8-0870-7525976433f2.dc1 127.0.0.1
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.270607 [INFO] serf: EventMemberJoin: Node 56c96894-d5d5-d072-f807-95d0b1677813 127.0.0.1
TestLeaveCommand - 2019/12/30 19:08:42.270607 [INFO] serf: EventMemberJoin: Node 2a3f6b00-ba38-2db8-0870-7525976433f2 127.0.0.1
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.272715 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestLeaveCommand - 2019/12/30 19:08:42.276006 [INFO] consul: Adding LAN server Node 2a3f6b00-ba38-2db8-0870-7525976433f2 (Addr: tcp/127.0.0.1:17512) (DC: dc1)
TestLeaveCommand - 2019/12/30 19:08:42.285066 [INFO] agent: Started DNS server 127.0.0.1:17507 (udp)
TestLeaveCommand - 2019/12/30 19:08:42.285822 [INFO] consul: Handled member-join event for server "Node 2a3f6b00-ba38-2db8-0870-7525976433f2.dc1" in area "wan"
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.286597 [INFO] consul: Adding LAN server Node 56c96894-d5d5-d072-f807-95d0b1677813 (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.288474 [INFO] consul: Handled member-join event for server "Node 56c96894-d5d5-d072-f807-95d0b1677813.dc1" in area "wan"
TestLeaveCommand - 2019/12/30 19:08:42.289698 [INFO] agent: Started DNS server 127.0.0.1:17507 (tcp)
TestLeaveCommand - 2019/12/30 19:08:42.294771 [INFO] agent: Started HTTP server on 127.0.0.1:17508 (tcp)
TestLeaveCommand - 2019/12/30 19:08:42.295074 [INFO] agent: started state syncer
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.298293 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.300669 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.300785 [INFO] agent: started state syncer
2019/12/30 19:08:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:17512 [Candidate] entering Candidate state in term 2
2019/12/30 19:08:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:17512 [Leader] entering Leader state
TestLeaveCommand - 2019/12/30 19:08:42.926652 [INFO] consul: cluster leadership acquired
TestLeaveCommand - 2019/12/30 19:08:42.927271 [INFO] consul: New leader elected: Node 2a3f6b00-ba38-2db8-0870-7525976433f2
2019/12/30 19:08:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:08:42 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.928849 [INFO] consul: cluster leadership acquired
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:42.929302 [INFO] consul: New leader elected: Node 56c96894-d5d5-d072-f807-95d0b1677813
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.269634 [INFO] agent: Requesting shutdown
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.269748 [INFO] consul: shutting down server
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.269805 [WARN] serf: Shutdown without a Leave
TestLeaveCommand - 2019/12/30 19:08:43.294853 [INFO] agent: Synced node info
TestLeaveCommand - 2019/12/30 19:08:43.294984 [DEBUG] agent: Node info in sync
TestLeaveCommand - 2019/12/30 19:08:43.314486 [INFO] consul: server starting leave
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.434548 [WARN] serf: Shutdown without a Leave
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.526213 [INFO] manager: shutting down
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.526691 [INFO] agent: consul server down
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.526757 [INFO] agent: shutdown complete
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.526816 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.526972 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.526998 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.527277 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.527536 [INFO] agent: Waiting for endpoints to shut down
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.527553 [ERR] consul: failed to establish leadership: raft is already shutdown
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.527652 [INFO] agent: Endpoints down
--- PASS: TestLeaveCommand_FailOnNonFlagArgs (2.43s)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/30 19:08:43.527777 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestLeaveCommand - 2019/12/30 19:08:43.610034 [INFO] serf: EventMemberLeave: Node 2a3f6b00-ba38-2db8-0870-7525976433f2.dc1 127.0.0.1
TestLeaveCommand - 2019/12/30 19:08:43.610400 [INFO] consul: Handled member-leave event for server "Node 2a3f6b00-ba38-2db8-0870-7525976433f2.dc1" in area "wan"
TestLeaveCommand - 2019/12/30 19:08:43.610470 [INFO] manager: shutting down
TestLeaveCommand - 2019/12/30 19:08:44.368627 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLeaveCommand - 2019/12/30 19:08:44.369169 [DEBUG] consul: Skipping self join check for "Node 2a3f6b00-ba38-2db8-0870-7525976433f2" since the cluster is too small
TestLeaveCommand - 2019/12/30 19:08:44.369331 [INFO] consul: member 'Node 2a3f6b00-ba38-2db8-0870-7525976433f2' joined, marking health alive
TestLeaveCommand - 2019/12/30 19:08:45.494505 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLeaveCommand - 2019/12/30 19:08:45.494600 [DEBUG] agent: Node info in sync
TestLeaveCommand - 2019/12/30 19:08:45.613603 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLeaveCommand - 2019/12/30 19:08:46.610688 [INFO] serf: EventMemberLeave: Node 2a3f6b00-ba38-2db8-0870-7525976433f2 127.0.0.1
TestLeaveCommand - 2019/12/30 19:08:46.611022 [INFO] consul: Removing LAN server Node 2a3f6b00-ba38-2db8-0870-7525976433f2 (Addr: tcp/127.0.0.1:17512) (DC: dc1)
TestLeaveCommand - 2019/12/30 19:08:46.611222 [WARN] consul: deregistering self (Node 2a3f6b00-ba38-2db8-0870-7525976433f2) should be done by follower
TestLeaveCommand - 2019/12/30 19:08:47.611016 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/30 19:08:49.611038 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/30 19:08:49.611390 [INFO] consul: Waiting 5s to drain RPC traffic
TestLeaveCommand - 2019/12/30 19:08:51.611122 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/30 19:08:53.615714 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/30 19:08:53.615726 [ERR] autopilot: Error promoting servers: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/30 19:08:54.611753 [INFO] agent: Requesting shutdown
TestLeaveCommand - 2019/12/30 19:08:54.611891 [INFO] consul: shutting down server
TestLeaveCommand - 2019/12/30 19:08:54.743657 [INFO] agent: consul server down
TestLeaveCommand - 2019/12/30 19:08:54.743734 [INFO] agent: shutdown complete
TestLeaveCommand - 2019/12/30 19:08:54.743833 [DEBUG] http: Request PUT /v1/agent/leave (11.429325702s) from=127.0.0.1:45218
TestLeaveCommand - 2019/12/30 19:08:54.744475 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (tcp)
TestLeaveCommand - 2019/12/30 19:08:54.744648 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (udp)
TestLeaveCommand - 2019/12/30 19:08:54.744799 [INFO] agent: Stopping HTTP server 127.0.0.1:17508 (tcp)
TestLeaveCommand - 2019/12/30 19:08:54.745214 [INFO] agent: Waiting for endpoints to shut down
TestLeaveCommand - 2019/12/30 19:08:54.745319 [INFO] agent: Endpoints down
--- PASS: TestLeaveCommand (13.65s)
PASS
ok  	github.com/hashicorp/consul/command/leave	13.922s
=== RUN   TestLockCommand_noTabs
=== PAUSE TestLockCommand_noTabs
=== RUN   TestLockCommand_BadArgs
=== PAUSE TestLockCommand_BadArgs
=== RUN   TestLockCommand
=== PAUSE TestLockCommand
=== RUN   TestLockCommand_NoShell
=== PAUSE TestLockCommand_NoShell
=== RUN   TestLockCommand_TryLock
=== PAUSE TestLockCommand_TryLock
=== RUN   TestLockCommand_TrySemaphore
=== PAUSE TestLockCommand_TrySemaphore
=== RUN   TestLockCommand_MonitorRetry_Lock_Default
=== PAUSE TestLockCommand_MonitorRetry_Lock_Default
=== RUN   TestLockCommand_MonitorRetry_Semaphore_Default
=== PAUSE TestLockCommand_MonitorRetry_Semaphore_Default
=== RUN   TestLockCommand_MonitorRetry_Lock_Arg
=== PAUSE TestLockCommand_MonitorRetry_Lock_Arg
=== RUN   TestLockCommand_MonitorRetry_Semaphore_Arg
=== PAUSE TestLockCommand_MonitorRetry_Semaphore_Arg
=== RUN   TestLockCommand_ChildExitCode
--- SKIP: TestLockCommand_ChildExitCode (0.00s)
    lock_test.go:302: DM-skipped
=== CONT  TestLockCommand_noTabs
=== CONT  TestLockCommand_MonitorRetry_Lock_Default
=== CONT  TestLockCommand_MonitorRetry_Semaphore_Arg
=== CONT  TestLockCommand_MonitorRetry_Lock_Arg
--- PASS: TestLockCommand_noTabs (0.01s)
=== CONT  TestLockCommand_TrySemaphore
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:12.719700 [WARN] agent: Node name "Node 4298ac64-d328-c22c-ce3a-349a4f153af1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:12.721920 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:12.736081 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:12.742124 [WARN] agent: Node name "Node c56d2f85-2efd-5091-d852-95be9dd8fa28" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:12.743124 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:12.747573 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_TrySemaphore - 2019/12/30 19:09:12.755284 [WARN] agent: Node name "Node b7e35b64-1b70-9b82-a562-b4eb244e91e0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_TrySemaphore - 2019/12/30 19:09:12.755917 [DEBUG] tlsutil: Update with version 1
TestLockCommand_TrySemaphore - 2019/12/30 19:09:12.771222 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:12.788932 [WARN] agent: Node name "Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:12.789795 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:12.792005 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:09:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4298ac64-d328-c22c-ce3a-349a4f153af1 Address:127.0.0.1:16018}]
2019/12/30 19:09:13 [INFO]  raft: Node at 127.0.0.1:16018 [Follower] entering Follower state (Leader: "")
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.907269 [INFO] serf: EventMemberJoin: Node 4298ac64-d328-c22c-ce3a-349a4f153af1.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.911128 [INFO] serf: EventMemberJoin: Node 4298ac64-d328-c22c-ce3a-349a4f153af1 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.912607 [INFO] consul: Adding LAN server Node 4298ac64-d328-c22c-ce3a-349a4f153af1 (Addr: tcp/127.0.0.1:16018) (DC: dc1)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.912774 [INFO] consul: Handled member-join event for server "Node 4298ac64-d328-c22c-ce3a-349a4f153af1.dc1" in area "wan"
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.913432 [INFO] agent: Started DNS server 127.0.0.1:16013 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.913746 [INFO] agent: Started DNS server 127.0.0.1:16013 (udp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.916100 [INFO] agent: Started HTTP server on 127.0.0.1:16014 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:13.916247 [INFO] agent: started state syncer
2019/12/30 19:09:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:13 [INFO]  raft: Node at 127.0.0.1:16018 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b7e35b64-1b70-9b82-a562-b4eb244e91e0 Address:127.0.0.1:16024}]
2019/12/30 19:09:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c56d2f85-2efd-5091-d852-95be9dd8fa28 Address:127.0.0.1:16006}]
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16024 [Follower] entering Follower state (Leader: "")
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16006 [Follower] entering Follower state (Leader: "")
2019/12/30 19:09:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c Address:127.0.0.1:16012}]
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16012 [Follower] entering Follower state (Leader: "")
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.163992 [INFO] serf: EventMemberJoin: Node c56d2f85-2efd-5091-d852-95be9dd8fa28.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.164239 [INFO] serf: EventMemberJoin: Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c.dc1 127.0.0.1
2019/12/30 19:09:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16024 [Candidate] entering Candidate state in term 2
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.208384 [INFO] serf: EventMemberJoin: Node b7e35b64-1b70-9b82-a562-b4eb244e91e0.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.209883 [INFO] serf: EventMemberJoin: Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.212411 [INFO] serf: EventMemberJoin: Node c56d2f85-2efd-5091-d852-95be9dd8fa28 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.212624 [INFO] agent: Started DNS server 127.0.0.1:16007 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.214122 [INFO] agent: Started DNS server 127.0.0.1:16001 (udp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.223376 [INFO] consul: Handled member-join event for server "Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c.dc1" in area "wan"
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.224904 [INFO] consul: Adding LAN server Node c56d2f85-2efd-5091-d852-95be9dd8fa28 (Addr: tcp/127.0.0.1:16006) (DC: dc1)
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.229242 [INFO] serf: EventMemberJoin: Node b7e35b64-1b70-9b82-a562-b4eb244e91e0 127.0.0.1
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.231043 [INFO] agent: Started DNS server 127.0.0.1:16019 (udp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.232232 [INFO] consul: Adding LAN server Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c (Addr: tcp/127.0.0.1:16012) (DC: dc1)
2019/12/30 19:09:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16006 [Candidate] entering Candidate state in term 2
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.240646 [INFO] agent: Started DNS server 127.0.0.1:16007 (tcp)
2019/12/30 19:09:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16012 [Candidate] entering Candidate state in term 2
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.247039 [INFO] agent: Started HTTP server on 127.0.0.1:16008 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.247265 [INFO] agent: started state syncer
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.242156 [INFO] consul: Handled member-join event for server "Node c56d2f85-2efd-5091-d852-95be9dd8fa28.dc1" in area "wan"
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.242506 [INFO] agent: Started DNS server 127.0.0.1:16001 (tcp)
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.240230 [INFO] consul: Handled member-join event for server "Node b7e35b64-1b70-9b82-a562-b4eb244e91e0.dc1" in area "wan"
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.241203 [INFO] agent: Started DNS server 127.0.0.1:16019 (tcp)
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.252025 [INFO] agent: Started HTTP server on 127.0.0.1:16020 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.253756 [INFO] agent: Started HTTP server on 127.0.0.1:16002 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.254112 [INFO] agent: started state syncer
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.242924 [INFO] consul: Adding LAN server Node b7e35b64-1b70-9b82-a562-b4eb244e91e0 (Addr: tcp/127.0.0.1:16024) (DC: dc1)
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.254946 [INFO] agent: started state syncer
2019/12/30 19:09:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16018 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:14.496645 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:14.497260 [INFO] consul: New leader elected: Node 4298ac64-d328-c22c-ce3a-349a4f153af1
2019/12/30 19:09:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16006 [Leader] entering Leader state
2019/12/30 19:09:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:14 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16024 [Leader] entering Leader state
2019/12/30 19:09:14 [INFO]  raft: Node at 127.0.0.1:16012 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.743770 [INFO] consul: cluster leadership acquired
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.745865 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:14.746389 [INFO] consul: New leader elected: Node c56d2f85-2efd-5091-d852-95be9dd8fa28
TestLockCommand_TrySemaphore - 2019/12/30 19:09:14.746460 [INFO] consul: New leader elected: Node b7e35b64-1b70-9b82-a562-b4eb244e91e0
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.746627 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:14.746958 [INFO] consul: New leader elected: Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:15.045854 [INFO] agent: Synced node info
TestLockCommand_TrySemaphore - 2019/12/30 19:09:15.144658 [INFO] agent: Synced node info
TestLockCommand_TrySemaphore - 2019/12/30 19:09:15.144767 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:15.146781 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:15.149328 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:15.587389 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:15.587507 [DEBUG] agent: Node info in sync
TestLockCommand_TrySemaphore - 2019/12/30 19:09:16.010640 [DEBUG] agent: Node info in sync
TestLockCommand_TrySemaphore - 2019/12/30 19:09:16.366167 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_TrySemaphore - 2019/12/30 19:09:16.366692 [DEBUG] consul: Skipping self join check for "Node b7e35b64-1b70-9b82-a562-b4eb244e91e0" since the cluster is too small
TestLockCommand_TrySemaphore - 2019/12/30 19:09:16.366874 [INFO] consul: member 'Node b7e35b64-1b70-9b82-a562-b4eb244e91e0' joined, marking health alive
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:16.477844 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:16.478383 [DEBUG] consul: Skipping self join check for "Node 4298ac64-d328-c22c-ce3a-349a4f153af1" since the cluster is too small
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:16.478568 [INFO] consul: member 'Node 4298ac64-d328-c22c-ce3a-349a4f153af1' joined, marking health alive
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:16.569214 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:16.569854 [DEBUG] consul: Skipping self join check for "Node c56d2f85-2efd-5091-d852-95be9dd8fa28" since the cluster is too small
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:16.570037 [INFO] consul: member 'Node c56d2f85-2efd-5091-d852-95be9dd8fa28' joined, marking health alive
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:16.680051 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:16.680537 [DEBUG] consul: Skipping self join check for "Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c" since the cluster is too small
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:16.680713 [INFO] consul: member 'Node 6d9e6d73-b72d-0232-fb0a-bc3d74f1bf6c' joined, marking health alive
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:16.741591 [DEBUG] http: Request GET /v1/agent/self (35.973306ms) from=127.0.0.1:58600
TestLockCommand_TrySemaphore - 2019/12/30 19:09:16.806151 [DEBUG] http: Request GET /v1/agent/self (59.393606ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:16.852632 [DEBUG] http: Request GET /v1/agent/self (7.26553ms) from=127.0.0.1:41466
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:17.039781 [DEBUG] http: Request GET /v1/agent/self (10.049939ms) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:17.387566 [DEBUG] http: Request PUT /v1/session/create (586.072855ms) from=127.0.0.1:58600
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.388808 [DEBUG] http: Request PUT /v1/session/create (524.069177ms) from=127.0.0.1:41466
TestLockCommand_TrySemaphore - 2019/12/30 19:09:17.390216 [DEBUG] http: Request PUT /v1/session/create (544.743404ms) from=127.0.0.1:60812
TestLockCommand_TrySemaphore - 2019/12/30 19:09:17.391688 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:17.394528 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.394956 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (289.008µs) from=127.0.0.1:41466
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:17.399221 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (318.675µs) from=127.0.0.1:58600
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:17.510538 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:17.513389 [DEBUG] http: Request PUT /v1/session/create (456.015003ms) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:17.712429 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=e74d687e-7dd6-9978-1a69-812f49663f43&flags=3304740253564472344 (311.465426ms) from=127.0.0.1:58600
TestLockCommand_TrySemaphore - 2019/12/30 19:09:17.715005 [DEBUG] http: Request PUT /v1/kv/test/prefix/3efa8501-17dd-4fa6-bc28-91d5aeb8e2d3?acquire=3efa8501-17dd-4fa6-bc28-91d5aeb8e2d3&flags=16210313421097356768 (309.373369ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.723055 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:17.727383 [DEBUG] http: Request PUT /v1/kv/test/prefix/3af2b63f-8faa-606a-d4e9-7aa0191a1568?acquire=3af2b63f-8faa-606a-d4e9-7aa0191a1568&flags=16210313421097356768 (211.165046ms) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.729325 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=ed6278a3-1c51-26c5-7363-6015338a1ccb&flags=3304740253564472344 (332.590997ms) from=127.0.0.1:41466
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:17.743783 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (20.123212ms) from=127.0.0.1:58600
TestLockCommand_TrySemaphore - 2019/12/30 19:09:17.743815 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=10000ms (6.521177ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.746764 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.746900 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.747130 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:17.752844 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=15000ms (1.474373ms) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:17.766148 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (18.034822ms) from=127.0.0.1:41466
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.028804 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (271.651015ms) from=127.0.0.1:50740
TestLockCommand_TrySemaphore - 2019/12/30 19:09:18.029703 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (270.660656ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.028856 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=e74d687e-7dd6-9978-1a69-812f49663f43 (181.092233ms) from=127.0.0.1:58608
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.039006 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=ed6278a3-1c51-26c5-7363-6015338a1ccb (192.071196ms) from=127.0.0.1:41470
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.041501 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (291.327215ms) from=127.0.0.1:58600
TestLockCommand_TrySemaphore - 2019/12/30 19:09:18.047975 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (9.501924ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.050723 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (6.294837ms) from=127.0.0.1:58608
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.055053 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.062656 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.062768 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.087096 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (313.315143ms) from=127.0.0.1:41466
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.103452 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (897.357µs) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.113286 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.837716ms) from=127.0.0.1:41470
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.126279 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (12.025325ms) from=127.0.0.1:50750
TestLockCommand_TrySemaphore - 2019/12/30 19:09:18.136733 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (8.831906ms) from=127.0.0.1:60812
TestLockCommand_TrySemaphore - 2019/12/30 19:09:18.345357 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (205.059214ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.347653 [DEBUG] http: Request PUT /v1/session/destroy/e74d687e-7dd6-9978-1a69-812f49663f43 (238.946464ms) from=127.0.0.1:58610
TestLockCommand_TrySemaphore - 2019/12/30 19:09:18.347784 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (216.293518ms) from=127.0.0.1:60830
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.350410 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (216.48619ms) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.352260 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (218.386574ms) from=127.0.0.1:50750
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.420844 [DEBUG] http: Request PUT /v1/session/destroy/ed6278a3-1c51-26c5-7363-6015338a1ccb (303.910222ms) from=127.0.0.1:41476
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.422004 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (291.681224ms) from=127.0.0.1:41470
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.423782 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.423902 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.424115 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.571976 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (461.196143ms) from=127.0.0.1:58608
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.574122 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.574234 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.574324 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/12/30 19:09:18.577699 [DEBUG] http: Request DELETE /v1/kv/test/prefix/3efa8501-17dd-4fa6-bc28-91d5aeb8e2d3 (225.159425ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.582240 [DEBUG] http: Request DELETE /v1/kv/test/prefix/3af2b63f-8faa-606a-d4e9-7aa0191a1568 (226.744134ms) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.591289 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:18.595641 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (1.695713ms) from=127.0.0.1:50740
TestLockCommand_TrySemaphore - 2019/12/30 19:09:18.638258 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (52.013074ms) from=127.0.0.1:60812
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.910124 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.976991 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.977834 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.977908 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.977965 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.978115 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.978271 [INFO] agent: Stopping HTTP server 127.0.0.1:16002 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.979115 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.982075 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.982727 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.982781 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.982833 [INFO] agent: Stopping DNS server 127.0.0.1:16013 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.982983 [INFO] agent: Stopping DNS server 127.0.0.1:16013 (udp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.983136 [INFO] agent: Stopping HTTP server 127.0.0.1:16014 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.983839 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/30 19:09:18.984051 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Lock_Arg (6.43s)
=== CONT  TestLockCommand_TryLock
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/30 19:09:18.979374 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Lock_Default (6.44s)
=== CONT  TestLockCommand_MonitorRetry_Semaphore_Default
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.046963 [DEBUG] http: Request PUT /v1/session/destroy/3efa8501-17dd-4fa6-bc28-91d5aeb8e2d3 (463.009858ms) from=127.0.0.1:60830
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:19.107550 [WARN] agent: Node name "Node 06584732-7b1c-c12f-ab73-9a7276deb67d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:19.108026 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:19.119586 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_TryLock - 2019/12/30 19:09:19.136094 [WARN] agent: Node name "Node 3fce67b3-545e-ad4f-91ac-8613c66241fb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_TryLock - 2019/12/30 19:09:19.136482 [DEBUG] tlsutil: Update with version 1
TestLockCommand_TryLock - 2019/12/30 19:09:19.138667 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.362168 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (712.175598ms) from=127.0.0.1:60812
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.364114 [INFO] agent: Requesting shutdown
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.364222 [INFO] consul: shutting down server
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.364287 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.503352 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.504539 [DEBUG] http: Request PUT /v1/session/destroy/3af2b63f-8faa-606a-d4e9-7aa0191a1568 (916.075781ms) from=127.0.0.1:50750
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.505194 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (898.516305ms) from=127.0.0.1:50740
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.506528 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.506623 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.506672 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.601918 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.602986 [INFO] manager: shutting down
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.603653 [INFO] agent: consul server down
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.603714 [INFO] agent: shutdown complete
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.603772 [INFO] agent: Stopping DNS server 127.0.0.1:16019 (tcp)
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.603910 [INFO] agent: Stopping DNS server 127.0.0.1:16019 (udp)
TestLockCommand_TrySemaphore - 2019/12/30 19:09:19.604071 [INFO] agent: Stopping HTTP server 127.0.0.1:16020 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.693662 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.694378 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.694535 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.694589 [INFO] agent: Stopping DNS server 127.0.0.1:16007 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.694745 [INFO] agent: Stopping DNS server 127.0.0.1:16007 (udp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.694895 [INFO] agent: Stopping HTTP server 127.0.0.1:16008 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.695501 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/30 19:09:19.695723 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Semaphore_Arg (7.14s)
=== CONT  TestLockCommand
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand - 2019/12/30 19:09:19.811905 [WARN] agent: Node name "Node 56f5f94b-5073-7333-1663-5a5835b03b16" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand - 2019/12/30 19:09:19.812631 [DEBUG] tlsutil: Update with version 1
TestLockCommand - 2019/12/30 19:09:19.815168 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:09:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:06584732-7b1c-c12f-ab73-9a7276deb67d Address:127.0.0.1:16036}]
2019/12/30 19:09:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3fce67b3-545e-ad4f-91ac-8613c66241fb Address:127.0.0.1:16030}]
2019/12/30 19:09:20 [INFO]  raft: Node at 127.0.0.1:16036 [Follower] entering Follower state (Leader: "")
2019/12/30 19:09:20 [INFO]  raft: Node at 127.0.0.1:16030 [Follower] entering Follower state (Leader: "")
TestLockCommand_TryLock - 2019/12/30 19:09:20.291703 [INFO] serf: EventMemberJoin: Node 3fce67b3-545e-ad4f-91ac-8613c66241fb.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.295540 [INFO] serf: EventMemberJoin: Node 06584732-7b1c-c12f-ab73-9a7276deb67d.dc1 127.0.0.1
TestLockCommand_TryLock - 2019/12/30 19:09:20.298718 [INFO] serf: EventMemberJoin: Node 3fce67b3-545e-ad4f-91ac-8613c66241fb 127.0.0.1
TestLockCommand_TryLock - 2019/12/30 19:09:20.299901 [INFO] consul: Adding LAN server Node 3fce67b3-545e-ad4f-91ac-8613c66241fb (Addr: tcp/127.0.0.1:16030) (DC: dc1)
TestLockCommand_TryLock - 2019/12/30 19:09:20.300206 [INFO] consul: Handled member-join event for server "Node 3fce67b3-545e-ad4f-91ac-8613c66241fb.dc1" in area "wan"
TestLockCommand_TryLock - 2019/12/30 19:09:20.300723 [INFO] agent: Started DNS server 127.0.0.1:16025 (udp)
TestLockCommand_TryLock - 2019/12/30 19:09:20.300789 [INFO] agent: Started DNS server 127.0.0.1:16025 (tcp)
TestLockCommand_TryLock - 2019/12/30 19:09:20.303307 [INFO] agent: Started HTTP server on 127.0.0.1:16026 (tcp)
TestLockCommand_TryLock - 2019/12/30 19:09:20.303383 [INFO] agent: started state syncer
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.304549 [INFO] serf: EventMemberJoin: Node 06584732-7b1c-c12f-ab73-9a7276deb67d 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.306461 [INFO] consul: Adding LAN server Node 06584732-7b1c-c12f-ab73-9a7276deb67d (Addr: tcp/127.0.0.1:16036) (DC: dc1)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.309744 [INFO] consul: Handled member-join event for server "Node 06584732-7b1c-c12f-ab73-9a7276deb67d.dc1" in area "wan"
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.310601 [INFO] agent: Started DNS server 127.0.0.1:16031 (udp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.310675 [INFO] agent: Started DNS server 127.0.0.1:16031 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.318179 [INFO] agent: Started HTTP server on 127.0.0.1:16032 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:20.318298 [INFO] agent: started state syncer
2019/12/30 19:09:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:20 [INFO]  raft: Node at 127.0.0.1:16030 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:20 [INFO]  raft: Node at 127.0.0.1:16036 [Candidate] entering Candidate state in term 2
TestLockCommand_TrySemaphore - 2019/12/30 19:09:20.604533 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:16020 (tcp)
TestLockCommand_TrySemaphore - 2019/12/30 19:09:20.604613 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_TrySemaphore - 2019/12/30 19:09:20.604656 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_TrySemaphore (8.04s)
=== CONT  TestLockCommand_NoShell
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_NoShell - 2019/12/30 19:09:20.715105 [WARN] agent: Node name "Node d4f6ecd3-04fe-52c9-c245-625113f61ca4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_NoShell - 2019/12/30 19:09:20.715522 [DEBUG] tlsutil: Update with version 1
TestLockCommand_NoShell - 2019/12/30 19:09:20.717913 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:09:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:56f5f94b-5073-7333-1663-5a5835b03b16 Address:127.0.0.1:16042}]
2019/12/30 19:09:20 [INFO]  raft: Node at 127.0.0.1:16042 [Follower] entering Follower state (Leader: "")
TestLockCommand - 2019/12/30 19:09:20.782368 [INFO] serf: EventMemberJoin: Node 56f5f94b-5073-7333-1663-5a5835b03b16.dc1 127.0.0.1
TestLockCommand - 2019/12/30 19:09:20.786365 [INFO] serf: EventMemberJoin: Node 56f5f94b-5073-7333-1663-5a5835b03b16 127.0.0.1
TestLockCommand - 2019/12/30 19:09:20.787604 [INFO] consul: Adding LAN server Node 56f5f94b-5073-7333-1663-5a5835b03b16 (Addr: tcp/127.0.0.1:16042) (DC: dc1)
TestLockCommand - 2019/12/30 19:09:20.787884 [INFO] consul: Handled member-join event for server "Node 56f5f94b-5073-7333-1663-5a5835b03b16.dc1" in area "wan"
TestLockCommand - 2019/12/30 19:09:20.788961 [INFO] agent: Started DNS server 127.0.0.1:16037 (tcp)
TestLockCommand - 2019/12/30 19:09:20.789361 [INFO] agent: Started DNS server 127.0.0.1:16037 (udp)
TestLockCommand - 2019/12/30 19:09:20.792181 [INFO] agent: Started HTTP server on 127.0.0.1:16038 (tcp)
TestLockCommand - 2019/12/30 19:09:20.792281 [INFO] agent: started state syncer
2019/12/30 19:09:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:20 [INFO]  raft: Node at 127.0.0.1:16042 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:21 [INFO]  raft: Node at 127.0.0.1:16030 [Leader] entering Leader state
TestLockCommand_TryLock - 2019/12/30 19:09:21.102445 [INFO] consul: cluster leadership acquired
TestLockCommand_TryLock - 2019/12/30 19:09:21.102952 [INFO] consul: New leader elected: Node 3fce67b3-545e-ad4f-91ac-8613c66241fb
2019/12/30 19:09:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:21 [INFO]  raft: Node at 127.0.0.1:16036 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:21.105026 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:21.105483 [INFO] consul: New leader elected: Node 06584732-7b1c-c12f-ab73-9a7276deb67d
2019/12/30 19:09:21 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:21 [INFO]  raft: Node at 127.0.0.1:16042 [Leader] entering Leader state
TestLockCommand - 2019/12/30 19:09:21.744847 [INFO] consul: cluster leadership acquired
TestLockCommand - 2019/12/30 19:09:21.745529 [INFO] consul: New leader elected: Node 56f5f94b-5073-7333-1663-5a5835b03b16
TestLockCommand_TryLock - 2019/12/30 19:09:21.746683 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:22.354769 [INFO] agent: Synced node info
2019/12/30 19:09:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d4f6ecd3-04fe-52c9-c245-625113f61ca4 Address:127.0.0.1:16048}]
TestLockCommand - 2019/12/30 19:09:22.545445 [INFO] agent: Synced node info
TestLockCommand - 2019/12/30 19:09:22.545561 [DEBUG] agent: Node info in sync
TestLockCommand_NoShell - 2019/12/30 19:09:22.547931 [INFO] serf: EventMemberJoin: Node d4f6ecd3-04fe-52c9-c245-625113f61ca4.dc1 127.0.0.1
2019/12/30 19:09:22 [INFO]  raft: Node at 127.0.0.1:16048 [Follower] entering Follower state (Leader: "")
TestLockCommand_NoShell - 2019/12/30 19:09:22.555959 [INFO] serf: EventMemberJoin: Node d4f6ecd3-04fe-52c9-c245-625113f61ca4 127.0.0.1
TestLockCommand_NoShell - 2019/12/30 19:09:22.556885 [INFO] consul: Handled member-join event for server "Node d4f6ecd3-04fe-52c9-c245-625113f61ca4.dc1" in area "wan"
TestLockCommand_NoShell - 2019/12/30 19:09:22.557244 [INFO] consul: Adding LAN server Node d4f6ecd3-04fe-52c9-c245-625113f61ca4 (Addr: tcp/127.0.0.1:16048) (DC: dc1)
TestLockCommand_NoShell - 2019/12/30 19:09:22.557358 [INFO] agent: Started DNS server 127.0.0.1:16043 (udp)
TestLockCommand_NoShell - 2019/12/30 19:09:22.557756 [INFO] agent: Started DNS server 127.0.0.1:16043 (tcp)
TestLockCommand_NoShell - 2019/12/30 19:09:22.560155 [INFO] agent: Started HTTP server on 127.0.0.1:16044 (tcp)
TestLockCommand_NoShell - 2019/12/30 19:09:22.560265 [INFO] agent: started state syncer
2019/12/30 19:09:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:22 [INFO]  raft: Node at 127.0.0.1:16048 [Candidate] entering Candidate state in term 2
TestLockCommand_TryLock - 2019/12/30 19:09:22.767252 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/12/30 19:09:22.767364 [DEBUG] agent: Node info in sync
2019/12/30 19:09:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:23 [INFO]  raft: Node at 127.0.0.1:16048 [Leader] entering Leader state
TestLockCommand_NoShell - 2019/12/30 19:09:23.196749 [INFO] consul: cluster leadership acquired
TestLockCommand_NoShell - 2019/12/30 19:09:23.197165 [INFO] consul: New leader elected: Node d4f6ecd3-04fe-52c9-c245-625113f61ca4
TestLockCommand_TryLock - 2019/12/30 19:09:23.289972 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_TryLock - 2019/12/30 19:09:23.290509 [DEBUG] consul: Skipping self join check for "Node 3fce67b3-545e-ad4f-91ac-8613c66241fb" since the cluster is too small
TestLockCommand_TryLock - 2019/12/30 19:09:23.290708 [INFO] consul: member 'Node 3fce67b3-545e-ad4f-91ac-8613c66241fb' joined, marking health alive
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:23.469469 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:23.469924 [DEBUG] consul: Skipping self join check for "Node 06584732-7b1c-c12f-ab73-9a7276deb67d" since the cluster is too small
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:23.470091 [INFO] consul: member 'Node 06584732-7b1c-c12f-ab73-9a7276deb67d' joined, marking health alive
TestLockCommand_TryLock - 2019/12/30 19:09:23.514178 [DEBUG] http: Request GET /v1/agent/self (5.738488ms) from=127.0.0.1:56070
TestLockCommand_NoShell - 2019/12/30 19:09:23.594625 [INFO] agent: Synced node info
TestLockCommand - 2019/12/30 19:09:23.737927 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand - 2019/12/30 19:09:23.738440 [DEBUG] consul: Skipping self join check for "Node 56f5f94b-5073-7333-1663-5a5835b03b16" since the cluster is too small
TestLockCommand - 2019/12/30 19:09:23.738615 [INFO] consul: member 'Node 56f5f94b-5073-7333-1663-5a5835b03b16' joined, marking health alive
TestLockCommand_TryLock - 2019/12/30 19:09:23.743554 [DEBUG] http: Request PUT /v1/session/create (218.081898ms) from=127.0.0.1:56070
TestLockCommand_TryLock - 2019/12/30 19:09:23.750265 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=10000ms (231.007µs) from=127.0.0.1:56070
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:23.775873 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:23.775977 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:23.776053 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:23.782023 [DEBUG] http: Request GET /v1/agent/self (6.018829ms) from=127.0.0.1:48706
TestLockCommand_TryLock - 2019/12/30 19:09:23.928445 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=02f3df2e-7474-9998-68c9-03c7eda0b24f&flags=3304740253564472344 (176.319768ms) from=127.0.0.1:56070
TestLockCommand_TryLock - 2019/12/30 19:09:23.935021 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (963.693µs) from=127.0.0.1:56070
TestLockCommand - 2019/12/30 19:09:23.973527 [DEBUG] http: Request GET /v1/agent/self (9.739931ms) from=127.0.0.1:47832
TestLockCommand_NoShell - 2019/12/30 19:09:23.973755 [DEBUG] agent: Node info in sync
TestLockCommand_NoShell - 2019/12/30 19:09:23.973859 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.050698 [DEBUG] http: Request PUT /v1/session/create (257.49063ms) from=127.0.0.1:48706
TestLockCommand_TryLock - 2019/12/30 19:09:24.195224 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=02f3df2e-7474-9998-68c9-03c7eda0b24f (226.782133ms) from=127.0.0.1:56076
TestLockCommand - 2019/12/30 19:09:24.196625 [DEBUG] http: Request PUT /v1/session/create (211.901731ms) from=127.0.0.1:47832
TestLockCommand_TryLock - 2019/12/30 19:09:24.198043 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (252.767836ms) from=127.0.0.1:56070
TestLockCommand_TryLock - 2019/12/30 19:09:24.201293 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (998.027µs) from=127.0.0.1:56076
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.203763 [DEBUG] http: Request PUT /v1/kv/test/prefix/1d06be19-1ede-a9a5-1fb1-ff5a3126b973?acquire=1d06be19-1ede-a9a5-1fb1-ff5a3126b973&flags=16210313421097356768 (150.643074ms) from=127.0.0.1:48706
TestLockCommand - 2019/12/30 19:09:24.205489 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (322.676µs) from=127.0.0.1:47832
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.214006 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=15000ms (1.286701ms) from=127.0.0.1:48706
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.470072 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (253.201514ms) from=127.0.0.1:48706
TestLockCommand - 2019/12/30 19:09:24.471564 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=62d2e473-8e2a-5661-495b-72ea2184de29&flags=3304740253564472344 (264.568821ms) from=127.0.0.1:47832
TestLockCommand_TryLock - 2019/12/30 19:09:24.493271 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_TryLock - 2019/12/30 19:09:24.500772 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (293.691942ms) from=127.0.0.1:56076
TestLockCommand_TryLock - 2019/12/30 19:09:24.503488 [INFO] agent: Requesting shutdown
TestLockCommand_TryLock - 2019/12/30 19:09:24.503617 [INFO] consul: shutting down server
TestLockCommand_TryLock - 2019/12/30 19:09:24.503771 [WARN] serf: Shutdown without a Leave
TestLockCommand - 2019/12/30 19:09:24.509660 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (6.497176ms) from=127.0.0.1:47832
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.518541 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (35.126617ms) from=127.0.0.1:48706
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.539834 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (951.693µs) from=127.0.0.1:48714
TestLockCommand_TryLock - 2019/12/30 19:09:24.652715 [WARN] serf: Shutdown without a Leave
TestLockCommand_TryLock - 2019/12/30 19:09:24.727586 [INFO] manager: shutting down
TestLockCommand - 2019/12/30 19:09:24.728467 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=62d2e473-8e2a-5661-495b-72ea2184de29 (202.223468ms) from=127.0.0.1:47836
TestLockCommand - 2019/12/30 19:09:24.729535 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (215.117151ms) from=127.0.0.1:47832
TestLockCommand_TryLock - 2019/12/30 19:09:24.732289 [INFO] agent: consul server down
TestLockCommand_TryLock - 2019/12/30 19:09:24.732361 [INFO] agent: shutdown complete
TestLockCommand_TryLock - 2019/12/30 19:09:24.732417 [INFO] agent: Stopping DNS server 127.0.0.1:16025 (tcp)
TestLockCommand_TryLock - 2019/12/30 19:09:24.732578 [INFO] agent: Stopping DNS server 127.0.0.1:16025 (udp)
TestLockCommand_TryLock - 2019/12/30 19:09:24.733122 [INFO] agent: Stopping HTTP server 127.0.0.1:16026 (tcp)
TestLockCommand_TryLock - 2019/12/30 19:09:24.733922 [ERR] consul.session: Apply failed: leadership lost while committing log
TestLockCommand - 2019/12/30 19:09:24.733933 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (2.137724ms) from=127.0.0.1:47832
TestLockCommand_TryLock - 2019/12/30 19:09:24.734060 [ERR] http: Request PUT /v1/session/destroy/02f3df2e-7474-9998-68c9-03c7eda0b24f, error: leadership lost while committing log from=127.0.0.1:56070
TestLockCommand_TryLock - 2019/12/30 19:09:24.735248 [WARN] consul: error getting server health from "Node 3fce67b3-545e-ad4f-91ac-8613c66241fb": rpc error making call: EOF
TestLockCommand_NoShell - 2019/12/30 19:09:24.737749 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_NoShell - 2019/12/30 19:09:24.738189 [DEBUG] consul: Skipping self join check for "Node d4f6ecd3-04fe-52c9-c245-625113f61ca4" since the cluster is too small
TestLockCommand_NoShell - 2019/12/30 19:09:24.738343 [INFO] consul: member 'Node d4f6ecd3-04fe-52c9-c245-625113f61ca4' joined, marking health alive
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.738554 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.740814 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (198.059356ms) from=127.0.0.1:48714
TestLockCommand_TryLock - 2019/12/30 19:09:24.742508 [DEBUG] http: Request PUT /v1/session/destroy/02f3df2e-7474-9998-68c9-03c7eda0b24f (530.534014ms) from=127.0.0.1:56070
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.743391 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (218.057564ms) from=127.0.0.1:48706
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.995586 [DEBUG] http: Request DELETE /v1/kv/test/prefix/1d06be19-1ede-a9a5-1fb1-ff5a3126b973 (250.042762ms) from=127.0.0.1:48714
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:24.998757 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (648.017µs) from=127.0.0.1:48714
TestLockCommand_NoShell - 2019/12/30 19:09:25.023749 [DEBUG] http: Request GET /v1/agent/self (5.672153ms) from=127.0.0.1:57648
TestLockCommand - 2019/12/30 19:09:25.143054 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand - 2019/12/30 19:09:25.143128 [DEBUG] agent: Node info in sync
TestLockCommand - 2019/12/30 19:09:25.219729 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand - 2019/12/30 19:09:25.221680 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (483.580744ms) from=127.0.0.1:47832
TestLockCommand - 2019/12/30 19:09:25.223132 [DEBUG] http: Request PUT /v1/session/destroy/62d2e473-8e2a-5661-495b-72ea2184de29 (490.939276ms) from=127.0.0.1:47836
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.225851 [DEBUG] http: Request PUT /v1/session/destroy/1d06be19-1ede-a9a5-1fb1-ff5a3126b973 (224.504071ms) from=127.0.0.1:48706
TestLockCommand - 2019/12/30 19:09:25.227999 [INFO] agent: Requesting shutdown
TestLockCommand - 2019/12/30 19:09:25.228089 [INFO] consul: shutting down server
TestLockCommand - 2019/12/30 19:09:25.228152 [WARN] serf: Shutdown without a Leave
TestLockCommand_TryLock - 2019/12/30 19:09:25.233698 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_TryLock - 2019/12/30 19:09:25.233774 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_TryLock (6.25s)
=== CONT  TestLockCommand_BadArgs
--- PASS: TestLockCommand_BadArgs (0.01s)
TestLockCommand - 2019/12/30 19:09:25.368800 [WARN] serf: Shutdown without a Leave
TestLockCommand_NoShell - 2019/12/30 19:09:25.370526 [DEBUG] http: Request PUT /v1/session/create (334.738718ms) from=127.0.0.1:57648
TestLockCommand_NoShell - 2019/12/30 19:09:25.374706 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (312.675µs) from=127.0.0.1:57648
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.463207 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (460.097108ms) from=127.0.0.1:48714
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.464722 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.464808 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.464854 [WARN] serf: Shutdown without a Leave
TestLockCommand_TryLock - 2019/12/30 19:09:25.492706 [WARN] consul: error getting server health from "Node 3fce67b3-545e-ad4f-91ac-8613c66241fb": context deadline exceeded
TestLockCommand - 2019/12/30 19:09:25.535535 [INFO] manager: shutting down
TestLockCommand - 2019/12/30 19:09:25.536441 [INFO] agent: consul server down
TestLockCommand - 2019/12/30 19:09:25.536508 [INFO] agent: shutdown complete
TestLockCommand - 2019/12/30 19:09:25.536579 [INFO] agent: Stopping DNS server 127.0.0.1:16037 (tcp)
TestLockCommand - 2019/12/30 19:09:25.536748 [INFO] agent: Stopping DNS server 127.0.0.1:16037 (udp)
TestLockCommand - 2019/12/30 19:09:25.536921 [INFO] agent: Stopping HTTP server 127.0.0.1:16038 (tcp)
TestLockCommand - 2019/12/30 19:09:25.537514 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand - 2019/12/30 19:09:25.537623 [INFO] agent: Endpoints down
--- PASS: TestLockCommand (5.84s)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.547590 [WARN] serf: Shutdown without a Leave
TestLockCommand_NoShell - 2019/12/30 19:09:25.628282 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=6a4c7a99-fb8b-f48e-5d55-d54eee42cac2&flags=3304740253564472344 (252.287822ms) from=127.0.0.1:57648
TestLockCommand_NoShell - 2019/12/30 19:09:25.644145 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (6.386506ms) from=127.0.0.1:57648
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.727147 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.727931 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.727988 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.728062 [INFO] agent: Stopping DNS server 127.0.0.1:16031 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.728212 [INFO] agent: Stopping DNS server 127.0.0.1:16031 (udp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.728423 [INFO] agent: Stopping HTTP server 127.0.0.1:16032 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.729093 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/30 19:09:25.729226 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Semaphore_Default (6.74s)
TestLockCommand_NoShell - 2019/12/30 19:09:25.895680 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=6a4c7a99-fb8b-f48e-5d55-d54eee42cac2 (236.511728ms) from=127.0.0.1:57650
TestLockCommand_NoShell - 2019/12/30 19:09:25.898754 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (251.505801ms) from=127.0.0.1:57648
TestLockCommand_NoShell - 2019/12/30 19:09:25.904582 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_NoShell - 2019/12/30 19:09:25.908067 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.336037ms) from=127.0.0.1:57650
TestLockCommand_NoShell - 2019/12/30 19:09:26.153823 [DEBUG] http: Request PUT /v1/session/destroy/6a4c7a99-fb8b-f48e-5d55-d54eee42cac2 (247.812367ms) from=127.0.0.1:57648
TestLockCommand_NoShell - 2019/12/30 19:09:26.428600 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (517.117316ms) from=127.0.0.1:57650
TestLockCommand_NoShell - 2019/12/30 19:09:26.430813 [INFO] agent: Requesting shutdown
TestLockCommand_NoShell - 2019/12/30 19:09:26.430914 [INFO] consul: shutting down server
TestLockCommand_NoShell - 2019/12/30 19:09:26.430980 [WARN] serf: Shutdown without a Leave
TestLockCommand_NoShell - 2019/12/30 19:09:27.118764 [WARN] serf: Shutdown without a Leave
TestLockCommand_NoShell - 2019/12/30 19:09:27.210450 [INFO] manager: shutting down
TestLockCommand_NoShell - 2019/12/30 19:09:27.211208 [INFO] agent: consul server down
TestLockCommand_NoShell - 2019/12/30 19:09:27.211270 [INFO] agent: shutdown complete
TestLockCommand_NoShell - 2019/12/30 19:09:27.211353 [INFO] agent: Stopping DNS server 127.0.0.1:16043 (tcp)
TestLockCommand_NoShell - 2019/12/30 19:09:27.211491 [INFO] agent: Stopping DNS server 127.0.0.1:16043 (udp)
TestLockCommand_NoShell - 2019/12/30 19:09:27.211651 [INFO] agent: Stopping HTTP server 127.0.0.1:16044 (tcp)
TestLockCommand_NoShell - 2019/12/30 19:09:27.212272 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_NoShell - 2019/12/30 19:09:27.212360 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_NoShell (6.61s)
PASS
ok  	github.com/hashicorp/consul/command/lock	15.152s
=== RUN   TestLoginCommand_noTabs
=== PAUSE TestLoginCommand_noTabs
=== RUN   TestLoginCommand
=== PAUSE TestLoginCommand
=== RUN   TestLoginCommand_k8s
=== PAUSE TestLoginCommand_k8s
=== CONT  TestLoginCommand_noTabs
=== CONT  TestLoginCommand_k8s
--- PASS: TestLoginCommand_noTabs (0.00s)
=== CONT  TestLoginCommand
WARNING: bootstrap = true: do not enable unless necessary
TestLoginCommand_k8s - 2019/12/30 19:09:20.443202 [WARN] agent: Node name "Node 564999af-e8d4-d3b7-f9d4-40af99505f91" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLoginCommand_k8s - 2019/12/30 19:09:20.451317 [DEBUG] tlsutil: Update with version 1
TestLoginCommand_k8s - 2019/12/30 19:09:20.462911 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLoginCommand - 2019/12/30 19:09:20.498609 [WARN] agent: Node name "Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLoginCommand - 2019/12/30 19:09:20.499709 [DEBUG] tlsutil: Update with version 1
TestLoginCommand - 2019/12/30 19:09:20.505828 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:09:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737 Address:127.0.0.1:52012}]
2019/12/30 19:09:22 [INFO]  raft: Node at 127.0.0.1:52012 [Follower] entering Follower state (Leader: "")
2019/12/30 19:09:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:564999af-e8d4-d3b7-f9d4-40af99505f91 Address:127.0.0.1:52006}]
2019/12/30 19:09:22 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestLoginCommand - 2019/12/30 19:09:22.358807 [INFO] serf: EventMemberJoin: Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737.dc1 127.0.0.1
TestLoginCommand_k8s - 2019/12/30 19:09:22.358807 [INFO] serf: EventMemberJoin: Node 564999af-e8d4-d3b7-f9d4-40af99505f91.dc1 127.0.0.1
TestLoginCommand_k8s - 2019/12/30 19:09:22.362733 [INFO] serf: EventMemberJoin: Node 564999af-e8d4-d3b7-f9d4-40af99505f91 127.0.0.1
TestLoginCommand_k8s - 2019/12/30 19:09:22.363533 [INFO] consul: Adding LAN server Node 564999af-e8d4-d3b7-f9d4-40af99505f91 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestLoginCommand_k8s - 2019/12/30 19:09:22.363644 [INFO] consul: Handled member-join event for server "Node 564999af-e8d4-d3b7-f9d4-40af99505f91.dc1" in area "wan"
TestLoginCommand_k8s - 2019/12/30 19:09:22.364304 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestLoginCommand_k8s - 2019/12/30 19:09:22.364734 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestLoginCommand - 2019/12/30 19:09:22.365662 [INFO] serf: EventMemberJoin: Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737 127.0.0.1
TestLoginCommand - 2019/12/30 19:09:22.366891 [INFO] agent: Started DNS server 127.0.0.1:52007 (udp)
TestLoginCommand - 2019/12/30 19:09:22.367340 [INFO] consul: Adding LAN server Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737 (Addr: tcp/127.0.0.1:52012) (DC: dc1)
TestLoginCommand - 2019/12/30 19:09:22.367593 [INFO] consul: Handled member-join event for server "Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737.dc1" in area "wan"
TestLoginCommand_k8s - 2019/12/30 19:09:22.367645 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestLoginCommand_k8s - 2019/12/30 19:09:22.367741 [INFO] agent: started state syncer
TestLoginCommand - 2019/12/30 19:09:22.368097 [INFO] agent: Started DNS server 127.0.0.1:52007 (tcp)
TestLoginCommand - 2019/12/30 19:09:22.370635 [INFO] agent: Started HTTP server on 127.0.0.1:52008 (tcp)
TestLoginCommand - 2019/12/30 19:09:22.370754 [INFO] agent: started state syncer
2019/12/30 19:09:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:22 [INFO]  raft: Node at 127.0.0.1:52012 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:22 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:23 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
2019/12/30 19:09:23 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:23 [INFO]  raft: Node at 127.0.0.1:52012 [Leader] entering Leader state
TestLoginCommand_k8s - 2019/12/30 19:09:23.087572 [INFO] consul: cluster leadership acquired
TestLoginCommand - 2019/12/30 19:09:23.088162 [INFO] consul: cluster leadership acquired
TestLoginCommand_k8s - 2019/12/30 19:09:23.088288 [INFO] consul: New leader elected: Node 564999af-e8d4-d3b7-f9d4-40af99505f91
TestLoginCommand - 2019/12/30 19:09:23.088620 [INFO] consul: New leader elected: Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737
TestLoginCommand_k8s - 2019/12/30 19:09:23.112520 [ERR] agent: failed to sync remote state: ACL not found
TestLoginCommand_k8s - 2019/12/30 19:09:23.116028 [INFO] acl: initializing acls
TestLoginCommand - 2019/12/30 19:09:23.120649 [INFO] acl: initializing acls
TestLoginCommand_k8s - 2019/12/30 19:09:23.282682 [ERR] agent: failed to sync remote state: ACL not found
TestLoginCommand - 2019/12/30 19:09:23.338378 [ERR] agent: failed to sync remote state: ACL not found
TestLoginCommand_k8s - 2019/12/30 19:09:23.469250 [INFO] acl: initializing acls
TestLoginCommand_k8s - 2019/12/30 19:09:23.469712 [INFO] consul: Created ACL 'global-management' policy
TestLoginCommand_k8s - 2019/12/30 19:09:23.469778 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand - 2019/12/30 19:09:23.472888 [INFO] acl: initializing acls
TestLoginCommand - 2019/12/30 19:09:23.473070 [INFO] consul: Created ACL 'global-management' policy
TestLoginCommand - 2019/12/30 19:09:23.473129 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand_k8s - 2019/12/30 19:09:23.736646 [INFO] consul: Created ACL 'global-management' policy
TestLoginCommand_k8s - 2019/12/30 19:09:23.736758 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand - 2019/12/30 19:09:24.045540 [INFO] consul: Bootstrapped ACL master token from configuration
TestLoginCommand - 2019/12/30 19:09:24.045875 [INFO] consul: Created ACL 'global-management' policy
TestLoginCommand - 2019/12/30 19:09:24.045942 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand_k8s - 2019/12/30 19:09:24.047413 [INFO] consul: Bootstrapped ACL master token from configuration
TestLoginCommand_k8s - 2019/12/30 19:09:24.200796 [INFO] consul: Bootstrapped ACL master token from configuration
TestLoginCommand - 2019/12/30 19:09:24.446862 [ERR] agent: failed to sync remote state: ACL not found
TestLoginCommand - 2019/12/30 19:09:24.471611 [INFO] consul: Created ACL anonymous token from configuration
TestLoginCommand - 2019/12/30 19:09:24.472940 [INFO] serf: EventMemberUpdate: Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737
TestLoginCommand - 2019/12/30 19:09:24.473764 [INFO] serf: EventMemberUpdate: Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737.dc1
TestLoginCommand - 2019/12/30 19:09:24.474765 [INFO] consul: Created ACL anonymous token from configuration
TestLoginCommand - 2019/12/30 19:09:24.474842 [DEBUG] acl: transitioning out of legacy ACL mode
TestLoginCommand - 2019/12/30 19:09:24.475738 [INFO] serf: EventMemberUpdate: Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737
TestLoginCommand - 2019/12/30 19:09:24.476396 [INFO] serf: EventMemberUpdate: Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737.dc1
TestLoginCommand_k8s - 2019/12/30 19:09:24.485596 [INFO] consul: Created ACL anonymous token from configuration
TestLoginCommand_k8s - 2019/12/30 19:09:24.485761 [DEBUG] acl: transitioning out of legacy ACL mode
TestLoginCommand_k8s - 2019/12/30 19:09:24.486859 [INFO] serf: EventMemberUpdate: Node 564999af-e8d4-d3b7-f9d4-40af99505f91
TestLoginCommand_k8s - 2019/12/30 19:09:24.487626 [INFO] serf: EventMemberUpdate: Node 564999af-e8d4-d3b7-f9d4-40af99505f91.dc1
TestLoginCommand_k8s - 2019/12/30 19:09:24.743954 [INFO] consul: Created ACL anonymous token from configuration
TestLoginCommand_k8s - 2019/12/30 19:09:24.750365 [INFO] serf: EventMemberUpdate: Node 564999af-e8d4-d3b7-f9d4-40af99505f91
TestLoginCommand_k8s - 2019/12/30 19:09:24.751948 [INFO] serf: EventMemberUpdate: Node 564999af-e8d4-d3b7-f9d4-40af99505f91.dc1
TestLoginCommand - 2019/12/30 19:09:25.627705 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLoginCommand - 2019/12/30 19:09:25.628434 [DEBUG] consul: Skipping self join check for "Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737" since the cluster is too small
TestLoginCommand - 2019/12/30 19:09:25.628578 [INFO] consul: member 'Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737' joined, marking health alive
TestLoginCommand - 2019/12/30 19:09:25.896377 [DEBUG] consul: Skipping self join check for "Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737" since the cluster is too small
TestLoginCommand - 2019/12/30 19:09:25.896973 [DEBUG] consul: Skipping self join check for "Node 5ee8bd74-8deb-cf3f-d6ab-ba8cdbb9d737" since the cluster is too small
=== RUN   TestLoginCommand/method_is_required
=== RUN   TestLoginCommand/token-sink-file_is_required
=== RUN   TestLoginCommand/bearer-token-file_is_required
=== RUN   TestLoginCommand/bearer-token-file_is_empty
=== RUN   TestLoginCommand/try_login_with_no_method_configured
TestLoginCommand - 2019/12/30 19:09:25.927488 [ERR] http: Request POST /v1/acl/login, error: ACL not found from=127.0.0.1:45712
TestLoginCommand - 2019/12/30 19:09:25.940666 [DEBUG] http: Request POST /v1/acl/login (13.836374ms) from=127.0.0.1:45712
TestLoginCommand_k8s - 2019/12/30 19:09:26.078906 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLoginCommand_k8s - 2019/12/30 19:09:26.079967 [DEBUG] consul: Skipping self join check for "Node 564999af-e8d4-d3b7-f9d4-40af99505f91" since the cluster is too small
TestLoginCommand_k8s - 2019/12/30 19:09:26.080103 [INFO] consul: member 'Node 564999af-e8d4-d3b7-f9d4-40af99505f91' joined, marking health alive
TestLoginCommand - 2019/12/30 19:09:26.157567 [DEBUG] http: Request PUT /v1/acl/auth-method (212.412743ms) from=127.0.0.1:45714
=== RUN   TestLoginCommand/try_login_with_method_configured_but_no_binding_rules
TestLoginCommand - 2019/12/30 19:09:26.190817 [DEBUG] acl: updating cached auth method validator for "test"
TestLoginCommand - 2019/12/30 19:09:26.190996 [ERR] http: Request POST /v1/acl/login, error: Permission denied from=127.0.0.1:45716
TestLoginCommand - 2019/12/30 19:09:26.191900 [DEBUG] http: Request POST /v1/acl/login (1.711713ms) from=127.0.0.1:45716
TestLoginCommand_k8s - 2019/12/30 19:09:26.268155 [DEBUG] consul: Skipping self join check for "Node 564999af-e8d4-d3b7-f9d4-40af99505f91" since the cluster is too small
TestLoginCommand_k8s - 2019/12/30 19:09:26.268766 [DEBUG] consul: Skipping self join check for "Node 564999af-e8d4-d3b7-f9d4-40af99505f91" since the cluster is too small
TestLoginCommand - 2019/12/30 19:09:27.120480 [DEBUG] http: Request PUT /v1/acl/binding-rule (925.797033ms) from=127.0.0.1:45714
=== RUN   TestLoginCommand/try_login_with_method_configured_and_binding_rules
TestLoginCommand - 2019/12/30 19:09:27.122604 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLoginCommand_k8s - 2019/12/30 19:09:27.271971 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLoginCommand_k8s - 2019/12/30 19:09:27.277083 [DEBUG] http: Request PUT /v1/acl/auth-method (794.372146ms) from=127.0.0.1:36020
TestLoginCommand_k8s - 2019/12/30 19:09:27.292061 [DEBUG] acl: updating cached auth method validator for "k8s"
TestLoginCommand - 2019/12/30 19:09:27.401650 [DEBUG] http: Request POST /v1/acl/login (274.611758ms) from=127.0.0.1:45720
TestLoginCommand - 2019/12/30 19:09:27.579529 [INFO] agent: Requesting shutdown
TestLoginCommand - 2019/12/30 19:09:27.579634 [INFO] consul: shutting down server
TestLoginCommand - 2019/12/30 19:09:27.579697 [WARN] serf: Shutdown without a Leave
TestLoginCommand_k8s - 2019/12/30 19:09:27.628562 [DEBUG] http: Request PUT /v1/acl/binding-rule (341.297895ms) from=127.0.0.1:36020
=== RUN   TestLoginCommand_k8s/try_login_with_method_configured_and_binding_rules
TestLoginCommand - 2019/12/30 19:09:27.677113 [WARN] serf: Shutdown without a Leave
TestLoginCommand - 2019/12/30 19:09:27.727138 [INFO] manager: shutting down
TestLoginCommand - 2019/12/30 19:09:27.727898 [INFO] agent: consul server down
TestLoginCommand - 2019/12/30 19:09:27.727964 [INFO] agent: shutdown complete
TestLoginCommand - 2019/12/30 19:09:27.728024 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (tcp)
TestLoginCommand - 2019/12/30 19:09:27.728178 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (udp)
TestLoginCommand - 2019/12/30 19:09:27.728376 [INFO] agent: Stopping HTTP server 127.0.0.1:52008 (tcp)
TestLoginCommand - 2019/12/30 19:09:27.729508 [INFO] agent: Waiting for endpoints to shut down
TestLoginCommand - 2019/12/30 19:09:27.729613 [INFO] agent: Endpoints down
--- PASS: TestLoginCommand (7.42s)
    --- PASS: TestLoginCommand/method_is_required (0.00s)
    --- PASS: TestLoginCommand/token-sink-file_is_required (0.00s)
    --- PASS: TestLoginCommand/bearer-token-file_is_required (0.00s)
    --- PASS: TestLoginCommand/bearer-token-file_is_empty (0.00s)
    --- PASS: TestLoginCommand/try_login_with_no_method_configured (0.02s)
    --- PASS: TestLoginCommand/try_login_with_method_configured_but_no_binding_rules (0.03s)
    --- PASS: TestLoginCommand/try_login_with_method_configured_and_binding_rules (0.46s)
TestLoginCommand_k8s - 2019/12/30 19:09:27.930728 [DEBUG] http: Request POST /v1/acl/login (294.967975ms) from=127.0.0.1:36024
TestLoginCommand_k8s - 2019/12/30 19:09:27.988288 [INFO] agent: Requesting shutdown
TestLoginCommand_k8s - 2019/12/30 19:09:27.988404 [INFO] consul: shutting down server
TestLoginCommand_k8s - 2019/12/30 19:09:27.988456 [WARN] serf: Shutdown without a Leave
TestLoginCommand_k8s - 2019/12/30 19:09:28.043771 [WARN] serf: Shutdown without a Leave
TestLoginCommand_k8s - 2019/12/30 19:09:28.093836 [INFO] manager: shutting down
TestLoginCommand_k8s - 2019/12/30 19:09:28.094703 [INFO] agent: consul server down
TestLoginCommand_k8s - 2019/12/30 19:09:28.094778 [INFO] agent: shutdown complete
TestLoginCommand_k8s - 2019/12/30 19:09:28.094840 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestLoginCommand_k8s - 2019/12/30 19:09:28.094995 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestLoginCommand_k8s - 2019/12/30 19:09:28.095167 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestLoginCommand_k8s - 2019/12/30 19:09:28.095858 [INFO] agent: Waiting for endpoints to shut down
TestLoginCommand_k8s - 2019/12/30 19:09:28.095994 [INFO] agent: Endpoints down
--- PASS: TestLoginCommand_k8s (7.78s)
    --- PASS: TestLoginCommand_k8s/try_login_with_method_configured_and_binding_rules (0.36s)
PASS
ok  	github.com/hashicorp/consul/command/login	8.295s
=== RUN   TestLogout_noTabs
=== PAUSE TestLogout_noTabs
=== RUN   TestLogoutCommand
=== PAUSE TestLogoutCommand
=== RUN   TestLogoutCommand_k8s
=== PAUSE TestLogoutCommand_k8s
=== CONT  TestLogout_noTabs
=== CONT  TestLogoutCommand_k8s
=== CONT  TestLogoutCommand
--- PASS: TestLogout_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestLogoutCommand_k8s - 2019/12/30 19:09:29.255882 [WARN] agent: Node name "Node b862e08f-ce6c-ea3e-b26b-596a61336080" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLogoutCommand_k8s - 2019/12/30 19:09:29.256963 [DEBUG] tlsutil: Update with version 1
TestLogoutCommand_k8s - 2019/12/30 19:09:29.297840 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLogoutCommand - 2019/12/30 19:09:29.313260 [WARN] agent: Node name "Node 5f684315-2289-3762-8992-d92f4be5d839" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLogoutCommand - 2019/12/30 19:09:29.314339 [DEBUG] tlsutil: Update with version 1
TestLogoutCommand - 2019/12/30 19:09:29.317304 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:09:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b862e08f-ce6c-ea3e-b26b-596a61336080 Address:127.0.0.1:34012}]
2019/12/30 19:09:30 [INFO]  raft: Node at 127.0.0.1:34012 [Follower] entering Follower state (Leader: "")
TestLogoutCommand_k8s - 2019/12/30 19:09:30.837667 [INFO] serf: EventMemberJoin: Node b862e08f-ce6c-ea3e-b26b-596a61336080.dc1 127.0.0.1
TestLogoutCommand_k8s - 2019/12/30 19:09:30.841774 [INFO] serf: EventMemberJoin: Node b862e08f-ce6c-ea3e-b26b-596a61336080 127.0.0.1
TestLogoutCommand_k8s - 2019/12/30 19:09:30.843650 [INFO] consul: Adding LAN server Node b862e08f-ce6c-ea3e-b26b-596a61336080 (Addr: tcp/127.0.0.1:34012) (DC: dc1)
TestLogoutCommand_k8s - 2019/12/30 19:09:30.844230 [INFO] consul: Handled member-join event for server "Node b862e08f-ce6c-ea3e-b26b-596a61336080.dc1" in area "wan"
TestLogoutCommand_k8s - 2019/12/30 19:09:30.845253 [INFO] agent: Started DNS server 127.0.0.1:34007 (udp)
TestLogoutCommand_k8s - 2019/12/30 19:09:30.846212 [INFO] agent: Started DNS server 127.0.0.1:34007 (tcp)
TestLogoutCommand_k8s - 2019/12/30 19:09:30.851411 [INFO] agent: Started HTTP server on 127.0.0.1:34008 (tcp)
TestLogoutCommand_k8s - 2019/12/30 19:09:30.851588 [INFO] agent: started state syncer
2019/12/30 19:09:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:30 [INFO]  raft: Node at 127.0.0.1:34012 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5f684315-2289-3762-8992-d92f4be5d839 Address:127.0.0.1:34006}]
2019/12/30 19:09:30 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
TestLogoutCommand - 2019/12/30 19:09:30.931552 [INFO] serf: EventMemberJoin: Node 5f684315-2289-3762-8992-d92f4be5d839.dc1 127.0.0.1
TestLogoutCommand - 2019/12/30 19:09:30.959278 [INFO] serf: EventMemberJoin: Node 5f684315-2289-3762-8992-d92f4be5d839 127.0.0.1
TestLogoutCommand - 2019/12/30 19:09:30.961110 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestLogoutCommand - 2019/12/30 19:09:30.961235 [INFO] consul: Adding LAN server Node 5f684315-2289-3762-8992-d92f4be5d839 (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestLogoutCommand - 2019/12/30 19:09:30.961517 [INFO] consul: Handled member-join event for server "Node 5f684315-2289-3762-8992-d92f4be5d839.dc1" in area "wan"
TestLogoutCommand - 2019/12/30 19:09:30.963212 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestLogoutCommand - 2019/12/30 19:09:30.965876 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestLogoutCommand - 2019/12/30 19:09:30.966100 [INFO] agent: started state syncer
2019/12/30 19:09:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:30 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:31 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
2019/12/30 19:09:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:31 [INFO]  raft: Node at 127.0.0.1:34012 [Leader] entering Leader state
TestLogoutCommand_k8s - 2019/12/30 19:09:31.423893 [INFO] consul: cluster leadership acquired
TestLogoutCommand - 2019/12/30 19:09:31.424122 [INFO] consul: cluster leadership acquired
TestLogoutCommand - 2019/12/30 19:09:31.424670 [INFO] consul: New leader elected: Node 5f684315-2289-3762-8992-d92f4be5d839
TestLogoutCommand_k8s - 2019/12/30 19:09:31.424946 [INFO] consul: New leader elected: Node b862e08f-ce6c-ea3e-b26b-596a61336080
TestLogoutCommand_k8s - 2019/12/30 19:09:31.443560 [ERR] agent: failed to sync remote state: ACL not found
TestLogoutCommand - 2019/12/30 19:09:31.517690 [ERR] agent: failed to sync remote state: ACL not found
TestLogoutCommand_k8s - 2019/12/30 19:09:31.597788 [INFO] acl: initializing acls
TestLogoutCommand - 2019/12/30 19:09:31.703210 [INFO] acl: initializing acls
TestLogoutCommand - 2019/12/30 19:09:31.714178 [INFO] acl: initializing acls
TestLogoutCommand_k8s - 2019/12/30 19:09:31.761640 [INFO] acl: initializing acls
TestLogoutCommand_k8s - 2019/12/30 19:09:31.914241 [INFO] consul: Created ACL 'global-management' policy
TestLogoutCommand_k8s - 2019/12/30 19:09:31.914323 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand - 2019/12/30 19:09:32.178443 [INFO] consul: Created ACL 'global-management' policy
TestLogoutCommand - 2019/12/30 19:09:32.178566 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand - 2019/12/30 19:09:32.184526 [INFO] consul: Created ACL 'global-management' policy
TestLogoutCommand - 2019/12/30 19:09:32.184676 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand_k8s - 2019/12/30 19:09:32.799087 [INFO] consul: Created ACL 'global-management' policy
TestLogoutCommand_k8s - 2019/12/30 19:09:32.799631 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand - 2019/12/30 19:09:33.329010 [INFO] consul: Bootstrapped ACL master token from configuration
TestLogoutCommand - 2019/12/30 19:09:33.329138 [INFO] consul: Bootstrapped ACL master token from configuration
TestLogoutCommand_k8s - 2019/12/30 19:09:33.554460 [INFO] consul: Bootstrapped ACL master token from configuration
TestLogoutCommand_k8s - 2019/12/30 19:09:33.554486 [INFO] consul: Bootstrapped ACL master token from configuration
TestLogoutCommand_k8s - 2019/12/30 19:09:33.856924 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand_k8s - 2019/12/30 19:09:33.862233 [INFO] serf: EventMemberUpdate: Node b862e08f-ce6c-ea3e-b26b-596a61336080
TestLogoutCommand_k8s - 2019/12/30 19:09:33.863755 [INFO] serf: EventMemberUpdate: Node b862e08f-ce6c-ea3e-b26b-596a61336080.dc1
TestLogoutCommand - 2019/12/30 19:09:33.920777 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand - 2019/12/30 19:09:33.921672 [INFO] serf: EventMemberUpdate: Node 5f684315-2289-3762-8992-d92f4be5d839
TestLogoutCommand - 2019/12/30 19:09:33.922278 [INFO] serf: EventMemberUpdate: Node 5f684315-2289-3762-8992-d92f4be5d839.dc1
TestLogoutCommand - 2019/12/30 19:09:33.924323 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand - 2019/12/30 19:09:33.924680 [DEBUG] acl: transitioning out of legacy ACL mode
TestLogoutCommand - 2019/12/30 19:09:33.926010 [INFO] serf: EventMemberUpdate: Node 5f684315-2289-3762-8992-d92f4be5d839
TestLogoutCommand - 2019/12/30 19:09:33.927457 [INFO] serf: EventMemberUpdate: Node 5f684315-2289-3762-8992-d92f4be5d839.dc1
TestLogoutCommand_k8s - 2019/12/30 19:09:34.161901 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand_k8s - 2019/12/30 19:09:34.162025 [DEBUG] acl: transitioning out of legacy ACL mode
TestLogoutCommand_k8s - 2019/12/30 19:09:34.163022 [INFO] serf: EventMemberUpdate: Node b862e08f-ce6c-ea3e-b26b-596a61336080
TestLogoutCommand_k8s - 2019/12/30 19:09:34.163635 [INFO] serf: EventMemberUpdate: Node b862e08f-ce6c-ea3e-b26b-596a61336080.dc1
TestLogoutCommand_k8s - 2019/12/30 19:09:34.495042 [INFO] agent: Synced node info
TestLogoutCommand_k8s - 2019/12/30 19:09:34.495220 [DEBUG] agent: Node info in sync
=== RUN   TestLogoutCommand_k8s/no_token_specified
TestLogoutCommand_k8s - 2019/12/30 19:09:34.509222 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:48140
TestLogoutCommand_k8s - 2019/12/30 19:09:34.511211 [DEBUG] http: Request POST /v1/acl/logout (1.957387ms) from=127.0.0.1:48140
=== RUN   TestLogoutCommand_k8s/logout_of_deleted_token
TestLogoutCommand_k8s - 2019/12/30 19:09:34.535519 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:48142
TestLogoutCommand_k8s - 2019/12/30 19:09:34.536528 [DEBUG] http: Request POST /v1/acl/logout (1.236366ms) from=127.0.0.1:48142
TestLogoutCommand - 2019/12/30 19:09:34.571587 [INFO] agent: Synced node info
TestLogoutCommand - 2019/12/30 19:09:34.571717 [DEBUG] agent: Node info in sync
=== RUN   TestLogoutCommand/no_token_specified
TestLogoutCommand - 2019/12/30 19:09:34.585103 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:58082
TestLogoutCommand - 2019/12/30 19:09:34.586006 [DEBUG] http: Request POST /v1/acl/logout (925.358µs) from=127.0.0.1:58082
=== RUN   TestLogoutCommand/logout_of_deleted_token
TestLogoutCommand - 2019/12/30 19:09:34.593258 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:58084
TestLogoutCommand - 2019/12/30 19:09:34.594855 [DEBUG] http: Request POST /v1/acl/logout (1.737047ms) from=127.0.0.1:58084
TestLogoutCommand_k8s - 2019/12/30 19:09:35.121971 [DEBUG] http: Request PUT /v1/acl/token (556.563375ms) from=127.0.0.1:48144
=== RUN   TestLogoutCommand_k8s/logout_of_ordinary_token
TestLogoutCommand_k8s - 2019/12/30 19:09:35.128267 [ERR] http: Request POST /v1/acl/logout, error: Permission denied from=127.0.0.1:48152
TestLogoutCommand_k8s - 2019/12/30 19:09:35.128959 [DEBUG] http: Request POST /v1/acl/logout (885.69µs) from=127.0.0.1:48152
TestLogoutCommand - 2019/12/30 19:09:35.189178 [DEBUG] http: Request PUT /v1/acl/token (589.961278ms) from=127.0.0.1:58086
=== RUN   TestLogoutCommand/logout_of_ordinary_token
TestLogoutCommand - 2019/12/30 19:09:35.195075 [ERR] http: Request POST /v1/acl/logout, error: Permission denied from=127.0.0.1:58090
TestLogoutCommand - 2019/12/30 19:09:35.196627 [DEBUG] http: Request POST /v1/acl/logout (1.692046ms) from=127.0.0.1:58090
TestLogoutCommand_k8s - 2019/12/30 19:09:35.704671 [DEBUG] http: Request PUT /v1/acl/auth-method (569.712397ms) from=127.0.0.1:48144
TestLogoutCommand_k8s - 2019/12/30 19:09:35.717211 [DEBUG] acl: updating cached auth method validator for "k8s"
TestLogoutCommand - 2019/12/30 19:09:35.795823 [DEBUG] http: Request PUT /v1/acl/auth-method (596.802796ms) from=127.0.0.1:58086
TestLogoutCommand - 2019/12/30 19:09:35.798880 [DEBUG] acl: updating cached auth method validator for "test"
TestLogoutCommand - 2019/12/30 19:09:36.139298 [DEBUG] http: Request PUT /v1/acl/binding-rule (341.240223ms) from=127.0.0.1:58086
TestLogoutCommand_k8s - 2019/12/30 19:09:36.347665 [DEBUG] http: Request PUT /v1/acl/binding-rule (635.559176ms) from=127.0.0.1:48144
TestLogoutCommand_k8s - 2019/12/30 19:09:36.654329 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLogoutCommand_k8s - 2019/12/30 19:09:36.654826 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLogoutCommand - 2019/12/30 19:09:36.736215 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLogoutCommand - 2019/12/30 19:09:36.744505 [DEBUG] http: Request POST /v1/acl/login (602.283944ms) from=127.0.0.1:58086
=== RUN   TestLogoutCommand/logout_of_login_token
TestLogoutCommand_k8s - 2019/12/30 19:09:36.963355 [DEBUG] consul: Skipping self join check for "Node b862e08f-ce6c-ea3e-b26b-596a61336080" since the cluster is too small
TestLogoutCommand_k8s - 2019/12/30 19:09:36.963589 [INFO] consul: member 'Node b862e08f-ce6c-ea3e-b26b-596a61336080' joined, marking health alive
TestLogoutCommand_k8s - 2019/12/30 19:09:36.965559 [DEBUG] http: Request POST /v1/acl/login (614.765948ms) from=127.0.0.1:48144
=== RUN   TestLogoutCommand_k8s/logout_of_login_token
TestLogoutCommand - 2019/12/30 19:09:37.036508 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLogoutCommand - 2019/12/30 19:09:37.302493 [DEBUG] consul: Skipping self join check for "Node 5f684315-2289-3762-8992-d92f4be5d839" since the cluster is too small
TestLogoutCommand - 2019/12/30 19:09:37.302726 [INFO] consul: member 'Node 5f684315-2289-3762-8992-d92f4be5d839' joined, marking health alive
TestLogoutCommand - 2019/12/30 19:09:37.305024 [DEBUG] http: Request POST /v1/acl/logout (551.24523ms) from=127.0.0.1:58094
TestLogoutCommand - 2019/12/30 19:09:37.306323 [INFO] agent: Requesting shutdown
TestLogoutCommand - 2019/12/30 19:09:37.306401 [INFO] consul: shutting down server
TestLogoutCommand - 2019/12/30 19:09:37.306449 [WARN] serf: Shutdown without a Leave
TestLogoutCommand - 2019/12/30 19:09:37.477466 [WARN] serf: Shutdown without a Leave
TestLogoutCommand - 2019/12/30 19:09:37.544056 [INFO] manager: shutting down
TestLogoutCommand_k8s - 2019/12/30 19:09:37.544248 [DEBUG] consul: Skipping self join check for "Node b862e08f-ce6c-ea3e-b26b-596a61336080" since the cluster is too small
TestLogoutCommand_k8s - 2019/12/30 19:09:37.544880 [DEBUG] consul: Skipping self join check for "Node b862e08f-ce6c-ea3e-b26b-596a61336080" since the cluster is too small
TestLogoutCommand - 2019/12/30 19:09:37.544973 [ERR] consul: failed to reconcile member: {Node 5f684315-2289-3762-8992-d92f4be5d839 127.0.0.1 34004 map[acls:1 bootstrap:1 build:1.5.2: dc:dc1 id:5f684315-2289-3762-8992-d92f4be5d839 port:34006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:34005] alive 1 5 2 2 5 4}: leadership lost while committing log
TestLogoutCommand_k8s - 2019/12/30 19:09:37.545706 [DEBUG] http: Request POST /v1/acl/logout (573.958844ms) from=127.0.0.1:48160
TestLogoutCommand - 2019/12/30 19:09:37.546851 [INFO] agent: consul server down
TestLogoutCommand - 2019/12/30 19:09:37.546928 [INFO] agent: shutdown complete
TestLogoutCommand - 2019/12/30 19:09:37.546986 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestLogoutCommand - 2019/12/30 19:09:37.547123 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestLogoutCommand - 2019/12/30 19:09:37.547274 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestLogoutCommand_k8s - 2019/12/30 19:09:37.547756 [INFO] agent: Requesting shutdown
TestLogoutCommand_k8s - 2019/12/30 19:09:37.547817 [INFO] consul: shutting down server
TestLogoutCommand_k8s - 2019/12/30 19:09:37.547860 [WARN] serf: Shutdown without a Leave
TestLogoutCommand - 2019/12/30 19:09:37.548137 [INFO] agent: Waiting for endpoints to shut down
TestLogoutCommand - 2019/12/30 19:09:37.548335 [INFO] agent: Endpoints down
--- PASS: TestLogoutCommand (8.46s)
    --- PASS: TestLogoutCommand/no_token_specified (0.01s)
    --- PASS: TestLogoutCommand/logout_of_deleted_token (0.01s)
    --- PASS: TestLogoutCommand/logout_of_ordinary_token (0.01s)
    --- PASS: TestLogoutCommand/logout_of_login_token (0.56s)
TestLogoutCommand_k8s - 2019/12/30 19:09:37.594027 [WARN] serf: Shutdown without a Leave
TestLogoutCommand_k8s - 2019/12/30 19:09:37.635725 [INFO] manager: shutting down
TestLogoutCommand_k8s - 2019/12/30 19:09:37.636513 [INFO] agent: consul server down
TestLogoutCommand_k8s - 2019/12/30 19:09:37.636571 [INFO] agent: shutdown complete
TestLogoutCommand_k8s - 2019/12/30 19:09:37.636624 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (tcp)
TestLogoutCommand_k8s - 2019/12/30 19:09:37.636753 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (udp)
TestLogoutCommand_k8s - 2019/12/30 19:09:37.636893 [INFO] agent: Stopping HTTP server 127.0.0.1:34008 (tcp)
TestLogoutCommand_k8s - 2019/12/30 19:09:37.637787 [INFO] agent: Waiting for endpoints to shut down
TestLogoutCommand_k8s - 2019/12/30 19:09:37.637894 [INFO] agent: Endpoints down
--- PASS: TestLogoutCommand_k8s (8.55s)
    --- PASS: TestLogoutCommand_k8s/no_token_specified (0.01s)
    --- PASS: TestLogoutCommand_k8s/logout_of_deleted_token (0.02s)
    --- PASS: TestLogoutCommand_k8s/logout_of_ordinary_token (0.01s)
    --- PASS: TestLogoutCommand_k8s/logout_of_login_token (0.58s)
PASS
ok  	github.com/hashicorp/consul/command/logout	8.850s
=== RUN   TestMaintCommand_noTabs
=== PAUSE TestMaintCommand_noTabs
=== RUN   TestMaintCommand_ConflictingArgs
=== PAUSE TestMaintCommand_ConflictingArgs
=== RUN   TestMaintCommand_NoArgs
=== PAUSE TestMaintCommand_NoArgs
=== RUN   TestMaintCommand_EnableNodeMaintenance
=== PAUSE TestMaintCommand_EnableNodeMaintenance
=== RUN   TestMaintCommand_DisableNodeMaintenance
=== PAUSE TestMaintCommand_DisableNodeMaintenance
=== RUN   TestMaintCommand_EnableServiceMaintenance
=== PAUSE TestMaintCommand_EnableServiceMaintenance
=== RUN   TestMaintCommand_DisableServiceMaintenance
=== PAUSE TestMaintCommand_DisableServiceMaintenance
=== RUN   TestMaintCommand_ServiceMaintenance_NoService
=== PAUSE TestMaintCommand_ServiceMaintenance_NoService
=== CONT  TestMaintCommand_noTabs
=== CONT  TestMaintCommand_ServiceMaintenance_NoService
=== CONT  TestMaintCommand_DisableServiceMaintenance
=== CONT  TestMaintCommand_EnableServiceMaintenance
--- PASS: TestMaintCommand_noTabs (0.02s)
=== CONT  TestMaintCommand_DisableNodeMaintenance
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:40.735644 [WARN] agent: Node name "Node 87e311b7-6f13-6416-17a2-d9b54f965f3b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:40.736575 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:40.771113 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:40.778898 [WARN] agent: Node name "Node cd6e9c1a-5a92-0ccb-ab6f-67f758c9728d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:40.779319 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:40.781590 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:40.819158 [WARN] agent: Node name "Node 2c0dafb0-e50f-ad20-8629-66bb2630866a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:40.819601 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:40.821740 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:40.862671 [WARN] agent: Node name "Node db7c3e1d-cf45-a610-522a-5972ee79be23" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:40.863267 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:40.870130 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:09:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:87e311b7-6f13-6416-17a2-d9b54f965f3b Address:127.0.0.1:11506}]
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11506 [Follower] entering Follower state (Leader: "")
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.749336 [INFO] serf: EventMemberJoin: Node 87e311b7-6f13-6416-17a2-d9b54f965f3b.dc1 127.0.0.1
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.752908 [INFO] serf: EventMemberJoin: Node 87e311b7-6f13-6416-17a2-d9b54f965f3b 127.0.0.1
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.754254 [INFO] consul: Adding LAN server Node 87e311b7-6f13-6416-17a2-d9b54f965f3b (Addr: tcp/127.0.0.1:11506) (DC: dc1)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.754793 [INFO] consul: Handled member-join event for server "Node 87e311b7-6f13-6416-17a2-d9b54f965f3b.dc1" in area "wan"
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.755714 [INFO] agent: Started DNS server 127.0.0.1:11501 (udp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.755973 [INFO] agent: Started DNS server 127.0.0.1:11501 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.759182 [INFO] agent: Started HTTP server on 127.0.0.1:11502 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:41.759377 [INFO] agent: started state syncer
2019/12/30 19:09:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11506 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:db7c3e1d-cf45-a610-522a-5972ee79be23 Address:127.0.0.1:11518}]
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11518 [Follower] entering Follower state (Leader: "")
2019/12/30 19:09:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cd6e9c1a-5a92-0ccb-ab6f-67f758c9728d Address:127.0.0.1:11524}]
2019/12/30 19:09:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2c0dafb0-e50f-ad20-8629-66bb2630866a Address:127.0.0.1:11512}]
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11512 [Follower] entering Follower state (Leader: "")
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.849297 [INFO] serf: EventMemberJoin: Node db7c3e1d-cf45-a610-522a-5972ee79be23.dc1 127.0.0.1
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11524 [Follower] entering Follower state (Leader: "")
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.853258 [INFO] serf: EventMemberJoin: Node cd6e9c1a-5a92-0ccb-ab6f-67f758c9728d.dc1 127.0.0.1
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.853408 [INFO] serf: EventMemberJoin: Node 2c0dafb0-e50f-ad20-8629-66bb2630866a.dc1 127.0.0.1
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.897570 [INFO] serf: EventMemberJoin: Node db7c3e1d-cf45-a610-522a-5972ee79be23 127.0.0.1
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.897694 [INFO] serf: EventMemberJoin: Node cd6e9c1a-5a92-0ccb-ab6f-67f758c9728d 127.0.0.1
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.899034 [INFO] agent: Started DNS server 127.0.0.1:11513 (udp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.900142 [INFO] consul: Handled member-join event for server "Node db7c3e1d-cf45-a610-522a-5972ee79be23.dc1" in area "wan"
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.900850 [INFO] consul: Adding LAN server Node db7c3e1d-cf45-a610-522a-5972ee79be23 (Addr: tcp/127.0.0.1:11518) (DC: dc1)
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.901155 [INFO] serf: EventMemberJoin: Node 2c0dafb0-e50f-ad20-8629-66bb2630866a 127.0.0.1
2019/12/30 19:09:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11518 [Candidate] entering Candidate state in term 2
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.902682 [INFO] agent: Started DNS server 127.0.0.1:11507 (udp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.903275 [INFO] agent: Started DNS server 127.0.0.1:11513 (tcp)
2019/12/30 19:09:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11524 [Candidate] entering Candidate state in term 2
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.907197 [INFO] agent: Started HTTP server on 127.0.0.1:11514 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:41.907327 [INFO] agent: started state syncer
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.912217 [INFO] agent: Started DNS server 127.0.0.1:11519 (udp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.913268 [INFO] agent: Started DNS server 127.0.0.1:11519 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.927769 [INFO] agent: Started HTTP server on 127.0.0.1:11520 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.927904 [INFO] agent: started state syncer
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.936037 [INFO] consul: Adding LAN server Node cd6e9c1a-5a92-0ccb-ab6f-67f758c9728d (Addr: tcp/127.0.0.1:11524) (DC: dc1)
2019/12/30 19:09:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:41 [INFO]  raft: Node at 127.0.0.1:11512 [Candidate] entering Candidate state in term 2
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.937895 [INFO] consul: Adding LAN server Node 2c0dafb0-e50f-ad20-8629-66bb2630866a (Addr: tcp/127.0.0.1:11512) (DC: dc1)
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.938358 [INFO] consul: Handled member-join event for server "Node 2c0dafb0-e50f-ad20-8629-66bb2630866a.dc1" in area "wan"
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:41.938496 [INFO] consul: Handled member-join event for server "Node cd6e9c1a-5a92-0ccb-ab6f-67f758c9728d.dc1" in area "wan"
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.941212 [INFO] agent: Started DNS server 127.0.0.1:11507 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.944799 [INFO] agent: Started HTTP server on 127.0.0.1:11508 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:41.944943 [INFO] agent: started state syncer
2019/12/30 19:09:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:42 [INFO]  raft: Node at 127.0.0.1:11506 [Leader] entering Leader state
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:42.337691 [INFO] consul: cluster leadership acquired
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:42.338304 [INFO] consul: New leader elected: Node 87e311b7-6f13-6416-17a2-d9b54f965f3b
2019/12/30 19:09:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:42 [INFO]  raft: Node at 127.0.0.1:11518 [Leader] entering Leader state
2019/12/30 19:09:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:42 [INFO]  raft: Node at 127.0.0.1:11524 [Leader] entering Leader state
2019/12/30 19:09:42 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:42 [INFO]  raft: Node at 127.0.0.1:11512 [Leader] entering Leader state
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.516356 [INFO] consul: cluster leadership acquired
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.516758 [INFO] consul: New leader elected: Node 2c0dafb0-e50f-ad20-8629-66bb2630866a
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.516995 [INFO] consul: cluster leadership acquired
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.517312 [INFO] consul: New leader elected: Node db7c3e1d-cf45-a610-522a-5972ee79be23
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:42.517524 [INFO] consul: cluster leadership acquired
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:42.517820 [INFO] consul: New leader elected: Node cd6e9c1a-5a92-0ccb-ab6f-67f758c9728d
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.846255 [INFO] agent: Synced service "test"
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.846346 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.856792 [INFO] agent: Synced node info
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.856997 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.861579 [DEBUG] http: Request GET /v1/agent/self (313.816144ms) from=127.0.0.1:52564
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.867734 [DEBUG] http: Request GET /v1/agent/self (341.180884ms) from=127.0.0.1:45358
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:42.873147 [INFO] agent: Synced node info
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.897503 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.897625 [DEBUG] http: Request PUT /v1/agent/maintenance?enable=false (131.003µs) from=127.0.0.1:52564
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.905763 [INFO] agent: Requesting shutdown
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.905881 [INFO] consul: shutting down server
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:42.905940 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.914142 [DEBUG] agent: Service "test" in sync
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.914227 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.914307 [DEBUG] http: Request PUT /v1/agent/service/maintenance/test?enable=false (210.672µs) from=127.0.0.1:45358
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.915049 [INFO] agent: Requesting shutdown
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.915132 [INFO] consul: shutting down server
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:42.915233 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.027645 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.030474 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.110901 [INFO] manager: shutting down
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.110921 [INFO] manager: shutting down
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.110925 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.111358 [INFO] agent: consul server down
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.111422 [INFO] agent: shutdown complete
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.111443 [ERR] consul: failed to establish leadership: raft is already shutdown
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.111478 [INFO] agent: Stopping DNS server 127.0.0.1:11513 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.111625 [INFO] agent: Stopping DNS server 127.0.0.1:11513 (udp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.111819 [INFO] agent: Stopping HTTP server 127.0.0.1:11514 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.112365 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_DisableNodeMaintenance - 2019/12/30 19:09:43.112487 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_DisableNodeMaintenance (2.52s)
=== CONT  TestMaintCommand_EnableNodeMaintenance
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.129548 [INFO] agent: Synced node info
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.129784 [DEBUG] agent: Node info in sync
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.133932 [DEBUG] http: Request GET /v1/agent/self (761.851249ms) from=127.0.0.1:56836
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.143203 [DEBUG] http: Request GET /v1/agent/self (265.65051ms) from=127.0.0.1:52976
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.151717 [DEBUG] http: Request PUT /v1/agent/service/maintenance/redis?enable=true&reason=broken (1.109697ms) from=127.0.0.1:56836
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.153900 [INFO] agent: Requesting shutdown
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.153995 [INFO] consul: shutting down server
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.154050 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:43.192620 [WARN] agent: Node name "Node a1246610-9506-ff2c-814a-f7dc151ffc7f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:43.193215 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:43.195876 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.277615 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.278959 [INFO] agent: consul server down
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.279034 [INFO] agent: shutdown complete
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.279114 [INFO] agent: Stopping DNS server 127.0.0.1:11507 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.279293 [INFO] agent: Stopping DNS server 127.0.0.1:11507 (udp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.279530 [INFO] agent: Stopping HTTP server 127.0.0.1:11508 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.279570 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.279741 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.280040 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_DisableServiceMaintenance - 2019/12/30 19:09:43.280129 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_DisableServiceMaintenance (2.70s)
=== CONT  TestMaintCommand_NoArgs
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.286457 [INFO] agent: Service "test" entered maintenance mode
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_NoArgs - 2019/12/30 19:09:43.341542 [WARN] agent: Node name "Node 7aad4486-4469-d689-9398-e3dc6654a813" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_NoArgs - 2019/12/30 19:09:43.342050 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_NoArgs - 2019/12/30 19:09:43.344550 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.368142 [INFO] manager: shutting down
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.639201 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.639467 [INFO] agent: consul server down
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.639522 [INFO] agent: shutdown complete
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.639574 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.639472 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.639711 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (udp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.639873 [INFO] agent: Stopping HTTP server 127.0.0.1:11502 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.640368 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/30 19:09:43.640475 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_ServiceMaintenance_NoService (3.06s)
=== CONT  TestMaintCommand_ConflictingArgs
--- PASS: TestMaintCommand_ConflictingArgs (0.00s)
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.814950 [INFO] agent: Synced service "test"
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.815091 [DEBUG] agent: Check "_service_maintenance:test" in sync
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.815159 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.815237 [DEBUG] http: Request PUT /v1/agent/service/maintenance/test?enable=true&reason=broken (652.027281ms) from=127.0.0.1:52976
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.816293 [INFO] agent: Requesting shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.816375 [INFO] consul: shutting down server
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.816421 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.895095 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.960972 [INFO] manager: shutting down
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.961630 [INFO] agent: consul server down
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.961695 [INFO] agent: shutdown complete
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.961752 [INFO] agent: Stopping DNS server 127.0.0.1:11519 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.961913 [INFO] agent: Stopping DNS server 127.0.0.1:11519 (udp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.962165 [INFO] agent: Stopping HTTP server 127.0.0.1:11520 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.962716 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.962816 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestMaintCommand_EnableServiceMaintenance - 2019/12/30 19:09:43.962857 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_EnableServiceMaintenance (3.39s)
2019/12/30 19:09:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a1246610-9506-ff2c-814a-f7dc151ffc7f Address:127.0.0.1:11530}]
2019/12/30 19:09:44 [INFO]  raft: Node at 127.0.0.1:11530 [Follower] entering Follower state (Leader: "")
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.490155 [INFO] serf: EventMemberJoin: Node a1246610-9506-ff2c-814a-f7dc151ffc7f.dc1 127.0.0.1
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.494025 [INFO] serf: EventMemberJoin: Node a1246610-9506-ff2c-814a-f7dc151ffc7f 127.0.0.1
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.494677 [INFO] consul: Handled member-join event for server "Node a1246610-9506-ff2c-814a-f7dc151ffc7f.dc1" in area "wan"
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.494791 [INFO] consul: Adding LAN server Node a1246610-9506-ff2c-814a-f7dc151ffc7f (Addr: tcp/127.0.0.1:11530) (DC: dc1)
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.495297 [INFO] agent: Started DNS server 127.0.0.1:11525 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.495366 [INFO] agent: Started DNS server 127.0.0.1:11525 (udp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.497744 [INFO] agent: Started HTTP server on 127.0.0.1:11526 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:44.497845 [INFO] agent: started state syncer
2019/12/30 19:09:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:44 [INFO]  raft: Node at 127.0.0.1:11530 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7aad4486-4469-d689-9398-e3dc6654a813 Address:127.0.0.1:11536}]
2019/12/30 19:09:44 [INFO]  raft: Node at 127.0.0.1:11536 [Follower] entering Follower state (Leader: "")
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.971549 [INFO] serf: EventMemberJoin: Node 7aad4486-4469-d689-9398-e3dc6654a813.dc1 127.0.0.1
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.980417 [INFO] serf: EventMemberJoin: Node 7aad4486-4469-d689-9398-e3dc6654a813 127.0.0.1
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.984526 [INFO] consul: Adding LAN server Node 7aad4486-4469-d689-9398-e3dc6654a813 (Addr: tcp/127.0.0.1:11536) (DC: dc1)
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.985420 [INFO] consul: Handled member-join event for server "Node 7aad4486-4469-d689-9398-e3dc6654a813.dc1" in area "wan"
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.987367 [INFO] agent: Started DNS server 127.0.0.1:11531 (tcp)
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.988257 [INFO] agent: Started DNS server 127.0.0.1:11531 (udp)
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.991660 [INFO] agent: Started HTTP server on 127.0.0.1:11532 (tcp)
TestMaintCommand_NoArgs - 2019/12/30 19:09:44.991768 [INFO] agent: started state syncer
2019/12/30 19:09:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:09:45 [INFO]  raft: Node at 127.0.0.1:11536 [Candidate] entering Candidate state in term 2
2019/12/30 19:09:45 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:45 [INFO]  raft: Node at 127.0.0.1:11530 [Leader] entering Leader state
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:45.379697 [INFO] consul: cluster leadership acquired
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:45.380120 [INFO] consul: New leader elected: Node a1246610-9506-ff2c-814a-f7dc151ffc7f
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:45.778596 [INFO] agent: Synced node info
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:45.778736 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:45.787305 [DEBUG] http: Request GET /v1/agent/self (103.468795ms) from=127.0.0.1:38254
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.004246 [INFO] agent: Node entered maintenance mode
2019/12/30 19:09:46 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:09:46 [INFO]  raft: Node at 127.0.0.1:11536 [Leader] entering Leader state
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.010086 [INFO] consul: cluster leadership acquired
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.010497 [INFO] consul: New leader elected: Node 7aad4486-4469-d689-9398-e3dc6654a813
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.403654 [INFO] agent: Synced node info
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.471374 [INFO] agent: Service "test" entered maintenance mode
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.558403 [INFO] agent: Node entered maintenance mode
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.561762 [INFO] agent: Synced check "_node_maintenance"
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.561855 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.561932 [DEBUG] http: Request PUT /v1/agent/maintenance?enable=true&reason=broken (756.003754ms) from=127.0.0.1:38254
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.562755 [INFO] agent: Requesting shutdown
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.562865 [INFO] consul: shutting down server
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.562913 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.645133 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.732763 [INFO] manager: shutting down
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.734962 [INFO] agent: consul server down
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.735031 [INFO] agent: shutdown complete
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.735091 [INFO] agent: Stopping DNS server 127.0.0.1:11525 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.735243 [INFO] agent: Stopping DNS server 127.0.0.1:11525 (udp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.735428 [INFO] agent: Stopping HTTP server 127.0.0.1:11526 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.735940 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.736037 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_EnableNodeMaintenance (3.62s)
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.737134 [DEBUG] http: Request GET /v1/agent/self (168.601221ms) from=127.0.0.1:45312
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.748687 [DEBUG] http: Request GET /v1/agent/checks (851.356µs) from=127.0.0.1:45312
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.752328 [INFO] agent: Requesting shutdown
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.752727 [INFO] consul: shutting down server
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.753002 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/12/30 19:09:46.763963 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.852515 [WARN] serf: Shutdown without a Leave
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.930081 [INFO] manager: shutting down
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.986696 [INFO] agent: consul server down
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.987010 [INFO] agent: shutdown complete
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.987372 [INFO] agent: Stopping DNS server 127.0.0.1:11531 (tcp)
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.987816 [INFO] agent: Stopping DNS server 127.0.0.1:11531 (udp)
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.988405 [INFO] agent: Stopping HTTP server 127.0.0.1:11532 (tcp)
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.988785 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.990676 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_NoArgs - 2019/12/30 19:09:46.991051 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_NoArgs (3.71s)
PASS
ok  	github.com/hashicorp/consul/command/maint	6.871s
=== RUN   TestMembersCommand_noTabs
=== PAUSE TestMembersCommand_noTabs
=== RUN   TestMembersCommand
=== PAUSE TestMembersCommand
=== RUN   TestMembersCommand_WAN
=== PAUSE TestMembersCommand_WAN
=== RUN   TestMembersCommand_statusFilter
=== PAUSE TestMembersCommand_statusFilter
=== RUN   TestMembersCommand_statusFilter_failed
=== PAUSE TestMembersCommand_statusFilter_failed
=== CONT  TestMembersCommand_noTabs
=== CONT  TestMembersCommand_statusFilter
=== CONT  TestMembersCommand_statusFilter_failed
=== CONT  TestMembersCommand_WAN
=== CONT  TestMembersCommand
--- PASS: TestMembersCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:15.196308 [WARN] agent: Node name "Node 032aa5af-75b5-256a-d68c-292c5bcb98f7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:15.197151 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:15.203781 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand - 2019/12/30 19:10:15.223233 [WARN] agent: Node name "Node 198662b4-4f79-8f09-c638-93937dd3f94c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand - 2019/12/30 19:10:15.223815 [DEBUG] tlsutil: Update with version 1
TestMembersCommand - 2019/12/30 19:10:15.227333 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_WAN - 2019/12/30 19:10:15.242246 [WARN] agent: Node name "Node eb34e960-62a9-6022-3c16-e78df36c6df8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_WAN - 2019/12/30 19:10:15.245964 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_WAN - 2019/12/30 19:10:15.251531 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_statusFilter - 2019/12/30 19:10:15.272613 [WARN] agent: Node name "Node ab381104-16d5-becf-9007-4d2e95ec8249" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_statusFilter - 2019/12/30 19:10:15.273463 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_statusFilter - 2019/12/30 19:10:15.277078 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:10:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:032aa5af-75b5-256a-d68c-292c5bcb98f7 Address:127.0.0.1:19012}]
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.184039 [INFO] serf: EventMemberJoin: Node 032aa5af-75b5-256a-d68c-292c5bcb98f7.dc1 127.0.0.1
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19012 [Follower] entering Follower state (Leader: "")
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.190322 [INFO] serf: EventMemberJoin: Node 032aa5af-75b5-256a-d68c-292c5bcb98f7 127.0.0.1
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.193663 [INFO] consul: Adding LAN server Node 032aa5af-75b5-256a-d68c-292c5bcb98f7 (Addr: tcp/127.0.0.1:19012) (DC: dc1)
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.194246 [INFO] consul: Handled member-join event for server "Node 032aa5af-75b5-256a-d68c-292c5bcb98f7.dc1" in area "wan"
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.196376 [INFO] agent: Started DNS server 127.0.0.1:19007 (tcp)
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.196574 [INFO] agent: Started DNS server 127.0.0.1:19007 (udp)
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.202008 [INFO] agent: Started HTTP server on 127.0.0.1:19008 (tcp)
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.202214 [INFO] agent: started state syncer
2019/12/30 19:10:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19012 [Candidate] entering Candidate state in term 2
2019/12/30 19:10:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ab381104-16d5-becf-9007-4d2e95ec8249 Address:127.0.0.1:19006}]
2019/12/30 19:10:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eb34e960-62a9-6022-3c16-e78df36c6df8 Address:127.0.0.1:19018}]
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19018 [Follower] entering Follower state (Leader: "")
TestMembersCommand_WAN - 2019/12/30 19:10:16.467164 [INFO] serf: EventMemberJoin: Node eb34e960-62a9-6022-3c16-e78df36c6df8.dc1 127.0.0.1
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.471947 [INFO] serf: EventMemberJoin: Node ab381104-16d5-becf-9007-4d2e95ec8249.dc1 127.0.0.1
2019/12/30 19:10:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:198662b4-4f79-8f09-c638-93937dd3f94c Address:127.0.0.1:19024}]
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19024 [Follower] entering Follower state (Leader: "")
TestMembersCommand_WAN - 2019/12/30 19:10:16.476600 [INFO] serf: EventMemberJoin: Node eb34e960-62a9-6022-3c16-e78df36c6df8 127.0.0.1
TestMembersCommand_WAN - 2019/12/30 19:10:16.477639 [INFO] consul: Adding LAN server Node eb34e960-62a9-6022-3c16-e78df36c6df8 (Addr: tcp/127.0.0.1:19018) (DC: dc1)
TestMembersCommand_WAN - 2019/12/30 19:10:16.477663 [INFO] consul: Handled member-join event for server "Node eb34e960-62a9-6022-3c16-e78df36c6df8.dc1" in area "wan"
TestMembersCommand_WAN - 2019/12/30 19:10:16.505386 [INFO] agent: Started DNS server 127.0.0.1:19013 (tcp)
TestMembersCommand_WAN - 2019/12/30 19:10:16.505486 [INFO] agent: Started DNS server 127.0.0.1:19013 (udp)
TestMembersCommand - 2019/12/30 19:10:16.507203 [INFO] serf: EventMemberJoin: Node 198662b4-4f79-8f09-c638-93937dd3f94c.dc1 127.0.0.1
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.508041 [INFO] serf: EventMemberJoin: Node ab381104-16d5-becf-9007-4d2e95ec8249 127.0.0.1
TestMembersCommand_WAN - 2019/12/30 19:10:16.508669 [INFO] agent: Started HTTP server on 127.0.0.1:19014 (tcp)
TestMembersCommand_WAN - 2019/12/30 19:10:16.508770 [INFO] agent: started state syncer
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.510457 [INFO] consul: Handled member-join event for server "Node ab381104-16d5-becf-9007-4d2e95ec8249.dc1" in area "wan"
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.510507 [INFO] consul: Adding LAN server Node ab381104-16d5-becf-9007-4d2e95ec8249 (Addr: tcp/127.0.0.1:19006) (DC: dc1)
TestMembersCommand - 2019/12/30 19:10:16.510850 [INFO] serf: EventMemberJoin: Node 198662b4-4f79-8f09-c638-93937dd3f94c 127.0.0.1
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.511153 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.511225 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestMembersCommand - 2019/12/30 19:10:16.511492 [INFO] consul: Adding LAN server Node 198662b4-4f79-8f09-c638-93937dd3f94c (Addr: tcp/127.0.0.1:19024) (DC: dc1)
TestMembersCommand - 2019/12/30 19:10:16.512073 [INFO] consul: Handled member-join event for server "Node 198662b4-4f79-8f09-c638-93937dd3f94c.dc1" in area "wan"
TestMembersCommand - 2019/12/30 19:10:16.512291 [INFO] agent: Started DNS server 127.0.0.1:19019 (udp)
TestMembersCommand - 2019/12/30 19:10:16.512857 [INFO] agent: Started DNS server 127.0.0.1:19019 (tcp)
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.514105 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestMembersCommand_statusFilter - 2019/12/30 19:10:16.514216 [INFO] agent: started state syncer
2019/12/30 19:10:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/12/30 19:10:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19018 [Candidate] entering Candidate state in term 2
TestMembersCommand - 2019/12/30 19:10:16.528190 [INFO] agent: Started HTTP server on 127.0.0.1:19020 (tcp)
TestMembersCommand - 2019/12/30 19:10:16.528312 [INFO] agent: started state syncer
2019/12/30 19:10:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19024 [Candidate] entering Candidate state in term 2
2019/12/30 19:10:16 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:10:16 [INFO]  raft: Node at 127.0.0.1:19012 [Leader] entering Leader state
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.831114 [INFO] consul: cluster leadership acquired
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:16.831569 [INFO] consul: New leader elected: Node 032aa5af-75b5-256a-d68c-292c5bcb98f7
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.003766 [DEBUG] http: Request GET /v1/agent/members?segment=_all (1.808716ms) from=127.0.0.1:53926
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.007709 [INFO] agent: Requesting shutdown
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.007802 [INFO] consul: shutting down server
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.007865 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.144820 [WARN] serf: Shutdown without a Leave
2019/12/30 19:10:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:10:17 [INFO]  raft: Node at 127.0.0.1:19018 [Leader] entering Leader state
2019/12/30 19:10:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:10:17 [INFO]  raft: Node at 127.0.0.1:19024 [Leader] entering Leader state
TestMembersCommand_WAN - 2019/12/30 19:10:17.148154 [INFO] consul: cluster leadership acquired
TestMembersCommand_WAN - 2019/12/30 19:10:17.148563 [INFO] consul: New leader elected: Node eb34e960-62a9-6022-3c16-e78df36c6df8
2019/12/30 19:10:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:10:17 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestMembersCommand - 2019/12/30 19:10:17.149094 [INFO] consul: cluster leadership acquired
TestMembersCommand - 2019/12/30 19:10:17.149551 [INFO] consul: New leader elected: Node 198662b4-4f79-8f09-c638-93937dd3f94c
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.149843 [INFO] consul: cluster leadership acquired
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.150170 [INFO] consul: New leader elected: Node ab381104-16d5-becf-9007-4d2e95ec8249
TestMembersCommand_WAN - 2019/12/30 19:10:17.239272 [DEBUG] http: Request GET /v1/agent/members?segment=_all&wan=1 (1.023027ms) from=127.0.0.1:33056
TestMembersCommand_WAN - 2019/12/30 19:10:17.243112 [INFO] agent: Requesting shutdown
TestMembersCommand_WAN - 2019/12/30 19:10:17.243230 [INFO] consul: shutting down server
TestMembersCommand_WAN - 2019/12/30 19:10:17.243296 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/12/30 19:10:17.243541 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.244937 [INFO] manager: shutting down
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.246659 [INFO] agent: consul server down
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.246719 [INFO] agent: shutdown complete
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.246772 [INFO] agent: Stopping DNS server 127.0.0.1:19007 (tcp)
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.246899 [INFO] agent: Stopping DNS server 127.0.0.1:19007 (udp)
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.247047 [INFO] agent: Stopping HTTP server 127.0.0.1:19008 (tcp)
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.247522 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.247608 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.247702 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.247751 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestMembersCommand_statusFilter_failed - 2019/12/30 19:10:17.247921 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_statusFilter_failed (2.21s)
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.297183 [DEBUG] http: Request GET /v1/agent/members?segment=_all (999.36µs) from=127.0.0.1:58158
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.299465 [INFO] agent: Requesting shutdown
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.299560 [INFO] consul: shutting down server
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.299603 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/12/30 19:10:17.338273 [WARN] serf: Shutdown without a Leave
TestMembersCommand - 2019/12/30 19:10:17.408966 [DEBUG] http: Request GET /v1/agent/members?segment=_all (4.390785ms) from=127.0.0.1:38660
TestMembersCommand - 2019/12/30 19:10:17.427977 [INFO] agent: Requesting shutdown
TestMembersCommand - 2019/12/30 19:10:17.428091 [INFO] consul: shutting down server
TestMembersCommand - 2019/12/30 19:10:17.428144 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.453185 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/12/30 19:10:17.456947 [INFO] manager: shutting down
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.561723 [INFO] manager: shutting down
TestMembersCommand_WAN - 2019/12/30 19:10:17.561932 [INFO] agent: consul server down
TestMembersCommand_WAN - 2019/12/30 19:10:17.562000 [INFO] agent: shutdown complete
TestMembersCommand_WAN - 2019/12/30 19:10:17.561940 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestMembersCommand_WAN - 2019/12/30 19:10:17.562136 [INFO] agent: Stopping DNS server 127.0.0.1:19013 (tcp)
TestMembersCommand - 2019/12/30 19:10:17.562287 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/12/30 19:10:17.562396 [INFO] agent: Stopping DNS server 127.0.0.1:19013 (udp)
TestMembersCommand_WAN - 2019/12/30 19:10:17.562552 [INFO] agent: Stopping HTTP server 127.0.0.1:19014 (tcp)
TestMembersCommand_WAN - 2019/12/30 19:10:17.563085 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_WAN - 2019/12/30 19:10:17.563177 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_WAN (2.53s)
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.563749 [INFO] agent: consul server down
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.563810 [INFO] agent: shutdown complete
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.563845 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.563899 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.563869 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.563777 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.564104 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.564264 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.564801 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_statusFilter - 2019/12/30 19:10:17.564919 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_statusFilter (2.54s)
TestMembersCommand - 2019/12/30 19:10:17.565494 [INFO] agent: Synced node info
TestMembersCommand - 2019/12/30 19:10:17.565604 [DEBUG] agent: Node info in sync
TestMembersCommand - 2019/12/30 19:10:17.636709 [INFO] manager: shutting down
TestMembersCommand - 2019/12/30 19:10:17.711604 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestMembersCommand - 2019/12/30 19:10:17.711854 [INFO] agent: consul server down
TestMembersCommand - 2019/12/30 19:10:17.711907 [INFO] agent: shutdown complete
TestMembersCommand - 2019/12/30 19:10:17.711970 [INFO] agent: Stopping DNS server 127.0.0.1:19019 (tcp)
TestMembersCommand - 2019/12/30 19:10:17.712140 [INFO] agent: Stopping DNS server 127.0.0.1:19019 (udp)
TestMembersCommand - 2019/12/30 19:10:17.712233 [ERR] consul: failed to establish leadership: raft is already shutdown
TestMembersCommand - 2019/12/30 19:10:17.712329 [INFO] agent: Stopping HTTP server 127.0.0.1:19020 (tcp)
TestMembersCommand - 2019/12/30 19:10:17.712379 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestMembersCommand - 2019/12/30 19:10:17.712435 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestMembersCommand - 2019/12/30 19:10:17.712492 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestMembersCommand - 2019/12/30 19:10:17.712547 [ERR] consul: failed to transfer leadership in 3 attempts
TestMembersCommand - 2019/12/30 19:10:17.712826 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand - 2019/12/30 19:10:17.712909 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand (2.66s)
PASS
ok  	github.com/hashicorp/consul/command/members	2.954s
=== RUN   TestMonitorCommand_exitsOnSignalBeforeLinesArrive
=== PAUSE TestMonitorCommand_exitsOnSignalBeforeLinesArrive
=== CONT  TestMonitorCommand_exitsOnSignalBeforeLinesArrive
WARNING: bootstrap = true: do not enable unless necessary
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:16.100527 [WARN] agent: Node name "Node 36514fc4-87d5-730a-93ca-085e12fef46d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:16.101528 [DEBUG] tlsutil: Update with version 1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:16.107952 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:10:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:36514fc4-87d5-730a-93ca-085e12fef46d Address:127.0.0.1:11506}]
2019/12/30 19:10:17 [INFO]  raft: Node at 127.0.0.1:11506 [Follower] entering Follower state (Leader: "")
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.359041 [INFO] serf: EventMemberJoin: Node 36514fc4-87d5-730a-93ca-085e12fef46d.dc1 127.0.0.1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.374144 [INFO] serf: EventMemberJoin: Node 36514fc4-87d5-730a-93ca-085e12fef46d 127.0.0.1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.375513 [INFO] consul: Handled member-join event for server "Node 36514fc4-87d5-730a-93ca-085e12fef46d.dc1" in area "wan"
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.375564 [INFO] consul: Adding LAN server Node 36514fc4-87d5-730a-93ca-085e12fef46d (Addr: tcp/127.0.0.1:11506) (DC: dc1)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.376328 [INFO] agent: Started DNS server 127.0.0.1:11501 (udp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.376418 [INFO] agent: Started DNS server 127.0.0.1:11501 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.379636 [INFO] agent: Started HTTP server on 127.0.0.1:11502 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.379837 [INFO] agent: started state syncer
2019/12/30 19:10:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:10:17 [INFO]  raft: Node at 127.0.0.1:11506 [Candidate] entering Candidate state in term 2
2019/12/30 19:10:17 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:10:17 [INFO]  raft: Node at 127.0.0.1:11506 [Leader] entering Leader state
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.980424 [INFO] consul: cluster leadership acquired
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:17.980974 [INFO] consul: New leader elected: Node 36514fc4-87d5-730a-93ca-085e12fef46d
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:18.654105 [INFO] agent: Synced node info
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.187128 [INFO] agent: Requesting shutdown
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.187266 [INFO] consul: shutting down server
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.187339 [WARN] serf: Shutdown without a Leave
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.303230 [WARN] serf: Shutdown without a Leave
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.345695 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.346413 [DEBUG] consul: Skipping self join check for "Node 36514fc4-87d5-730a-93ca-085e12fef46d" since the cluster is too small
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.346646 [INFO] consul: member 'Node 36514fc4-87d5-730a-93ca-085e12fef46d' joined, marking health alive
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.428322 [INFO] manager: shutting down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.563467 [INFO] agent: consul server down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.563552 [INFO] agent: shutdown complete
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.563637 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.563840 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (udp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.563999 [INFO] agent: Stopping HTTP server 127.0.0.1:11502 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:19.564011 [ERR] consul: failed to reconcile member: {Node 36514fc4-87d5-730a-93ca-085e12fef46d 127.0.0.1 11504 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:36514fc4-87d5-730a-93ca-085e12fef46d port:11506 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:11505] alive 1 5 2 2 5 4}: leadership lost while committing log
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:20.564558 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:11502 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:20.564699 [INFO] agent: Waiting for endpoints to shut down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/30 19:10:20.564756 [INFO] agent: Endpoints down
--- FAIL: TestMonitorCommand_exitsOnSignalBeforeLinesArrive (4.54s)
    monitor_test.go:70: timed out waiting for exit
FAIL
FAIL	github.com/hashicorp/consul/command/monitor	4.812s
=== RUN   TestOperatorCommand_noTabs
=== PAUSE TestOperatorCommand_noTabs
=== CONT  TestOperatorCommand_noTabs
--- PASS: TestOperatorCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator	0.039s
=== RUN   TestOperatorAutopilotCommand_noTabs
=== PAUSE TestOperatorAutopilotCommand_noTabs
=== CONT  TestOperatorAutopilotCommand_noTabs
--- PASS: TestOperatorAutopilotCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot	0.048s
=== RUN   TestOperatorAutopilotGetConfigCommand_noTabs
=== PAUSE TestOperatorAutopilotGetConfigCommand_noTabs
=== RUN   TestOperatorAutopilotGetConfigCommand
=== PAUSE TestOperatorAutopilotGetConfigCommand
=== CONT  TestOperatorAutopilotGetConfigCommand_noTabs
=== CONT  TestOperatorAutopilotGetConfigCommand
--- PASS: TestOperatorAutopilotGetConfigCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:34.490447 [WARN] agent: Node name "Node d336c23f-0964-37d6-3480-2c73bd603feb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:34.491383 [DEBUG] tlsutil: Update with version 1
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:34.498466 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:10:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d336c23f-0964-37d6-3480-2c73bd603feb Address:127.0.0.1:20506}]
2019/12/30 19:10:35 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.275460 [INFO] serf: EventMemberJoin: Node d336c23f-0964-37d6-3480-2c73bd603feb.dc1 127.0.0.1
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.281789 [INFO] serf: EventMemberJoin: Node d336c23f-0964-37d6-3480-2c73bd603feb 127.0.0.1
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.283044 [INFO] consul: Adding LAN server Node d336c23f-0964-37d6-3480-2c73bd603feb (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.284521 [INFO] consul: Handled member-join event for server "Node d336c23f-0964-37d6-3480-2c73bd603feb.dc1" in area "wan"
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.292971 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.293365 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.296181 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.296296 [INFO] agent: started state syncer
2019/12/30 19:10:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:10:35 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/30 19:10:35 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:10:35 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.762468 [INFO] consul: cluster leadership acquired
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:35.763079 [INFO] consul: New leader elected: Node d336c23f-0964-37d6-3480-2c73bd603feb
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:36.196356 [INFO] agent: Synced node info
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:36.196500 [DEBUG] agent: Node info in sync
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.005140 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.006594 [DEBUG] consul: Skipping self join check for "Node d336c23f-0964-37d6-3480-2c73bd603feb" since the cluster is too small
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.007020 [INFO] consul: member 'Node d336c23f-0964-37d6-3480-2c73bd603feb' joined, marking health alive
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.198442 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (1.987054ms) from=127.0.0.1:33642
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.201695 [INFO] agent: Requesting shutdown
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.201983 [INFO] consul: shutting down server
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.202126 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.270269 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.337807 [INFO] agent: consul server down
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.337888 [INFO] agent: shutdown complete
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.337965 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.338102 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.338243 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.338776 [INFO] agent: Waiting for endpoints to shut down
TestOperatorAutopilotGetConfigCommand - 2019/12/30 19:10:37.338917 [INFO] agent: Endpoints down
--- PASS: TestOperatorAutopilotGetConfigCommand (2.92s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot/get	3.210s
=== RUN   TestOperatorAutopilotSetConfigCommand_noTabs
=== PAUSE TestOperatorAutopilotSetConfigCommand_noTabs
=== RUN   TestOperatorAutopilotSetConfigCommand
=== PAUSE TestOperatorAutopilotSetConfigCommand
=== CONT  TestOperatorAutopilotSetConfigCommand_noTabs
=== CONT  TestOperatorAutopilotSetConfigCommand
--- PASS: TestOperatorAutopilotSetConfigCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:41.905508 [WARN] agent: Node name "Node 29bfc390-6f53-35c3-8c77-5c3162f41db6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:41.906551 [DEBUG] tlsutil: Update with version 1
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:41.919355 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:10:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:29bfc390-6f53-35c3-8c77-5c3162f41db6 Address:127.0.0.1:44506}]
2019/12/30 19:10:42 [INFO]  raft: Node at 127.0.0.1:44506 [Follower] entering Follower state (Leader: "")
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.684783 [INFO] serf: EventMemberJoin: Node 29bfc390-6f53-35c3-8c77-5c3162f41db6.dc1 127.0.0.1
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.693134 [INFO] serf: EventMemberJoin: Node 29bfc390-6f53-35c3-8c77-5c3162f41db6 127.0.0.1
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.700121 [INFO] consul: Handled member-join event for server "Node 29bfc390-6f53-35c3-8c77-5c3162f41db6.dc1" in area "wan"
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.700531 [INFO] consul: Adding LAN server Node 29bfc390-6f53-35c3-8c77-5c3162f41db6 (Addr: tcp/127.0.0.1:44506) (DC: dc1)
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.706715 [INFO] agent: Started DNS server 127.0.0.1:44501 (udp)
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.707159 [INFO] agent: Started DNS server 127.0.0.1:44501 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.710368 [INFO] agent: Started HTTP server on 127.0.0.1:44502 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:42.710542 [INFO] agent: started state syncer
2019/12/30 19:10:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:10:42 [INFO]  raft: Node at 127.0.0.1:44506 [Candidate] entering Candidate state in term 2
2019/12/30 19:10:43 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:10:43 [INFO]  raft: Node at 127.0.0.1:44506 [Leader] entering Leader state
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:43.146973 [INFO] consul: cluster leadership acquired
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:43.147498 [INFO] consul: New leader elected: Node 29bfc390-6f53-35c3-8c77-5c3162f41db6
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:43.429833 [INFO] agent: Synced node info
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:43.429967 [DEBUG] agent: Node info in sync
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.254877 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.256614 [DEBUG] consul: Skipping self join check for "Node 29bfc390-6f53-35c3-8c77-5c3162f41db6" since the cluster is too small
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.256829 [INFO] consul: member 'Node 29bfc390-6f53-35c3-8c77-5c3162f41db6' joined, marking health alive
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.444533 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (16.633115ms) from=127.0.0.1:44332
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.613720 [DEBUG] http: Request PUT /v1/operator/autopilot/configuration?cas=5 (151.418411ms) from=127.0.0.1:44332
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.615688 [INFO] agent: Requesting shutdown
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.615778 [INFO] consul: shutting down server
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.615824 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.670456 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.734993 [INFO] manager: shutting down
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.735441 [INFO] agent: consul server down
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.735491 [INFO] agent: shutdown complete
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.735540 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.735667 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (udp)
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.735812 [INFO] agent: Stopping HTTP server 127.0.0.1:44502 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.736320 [INFO] agent: Waiting for endpoints to shut down
TestOperatorAutopilotSetConfigCommand - 2019/12/30 19:10:44.736470 [INFO] agent: Endpoints down
--- PASS: TestOperatorAutopilotSetConfigCommand (2.98s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot/set	3.447s
=== RUN   TestOperatorRaftCommand_noTabs
=== PAUSE TestOperatorRaftCommand_noTabs
=== CONT  TestOperatorRaftCommand_noTabs
--- PASS: TestOperatorRaftCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft	0.075s
=== RUN   TestOperatorRaftListPeersCommand_noTabs
=== PAUSE TestOperatorRaftListPeersCommand_noTabs
=== RUN   TestOperatorRaftListPeersCommand
=== PAUSE TestOperatorRaftListPeersCommand
=== CONT  TestOperatorRaftListPeersCommand_noTabs
=== CONT  TestOperatorRaftListPeersCommand
--- PASS: TestOperatorRaftListPeersCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:07.479923 [WARN] agent: Node name "Node 8c20d6f9-abc1-b00b-bafe-ecbcf97a70ba" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:07.480806 [DEBUG] tlsutil: Update with version 1
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:07.487418 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:11:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8c20d6f9-abc1-b00b-bafe-ecbcf97a70ba Address:127.0.0.1:43006}]
2019/12/30 19:11:08 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.229792 [INFO] serf: EventMemberJoin: Node 8c20d6f9-abc1-b00b-bafe-ecbcf97a70ba.dc1 127.0.0.1
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.234695 [INFO] serf: EventMemberJoin: Node 8c20d6f9-abc1-b00b-bafe-ecbcf97a70ba 127.0.0.1
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.236766 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.237406 [INFO] consul: Adding LAN server Node 8c20d6f9-abc1-b00b-bafe-ecbcf97a70ba (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.237628 [INFO] consul: Handled member-join event for server "Node 8c20d6f9-abc1-b00b-bafe-ecbcf97a70ba.dc1" in area "wan"
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.238123 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.241461 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.241611 [INFO] agent: started state syncer
2019/12/30 19:11:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:11:08 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
2019/12/30 19:11:08 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:11:08 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.679976 [INFO] consul: cluster leadership acquired
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.680507 [INFO] consul: New leader elected: Node 8c20d6f9-abc1-b00b-bafe-ecbcf97a70ba
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.963888 [INFO] agent: Synced node info
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:08.964023 [DEBUG] agent: Node info in sync
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.148151 [DEBUG] http: Request GET /v1/operator/raft/configuration (79.42947ms) from=127.0.0.1:35610
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.156840 [INFO] agent: Requesting shutdown
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.157134 [INFO] consul: shutting down server
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.157411 [WARN] serf: Shutdown without a Leave
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.272494 [WARN] serf: Shutdown without a Leave
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.312757 [INFO] manager: shutting down
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.504695 [INFO] agent: consul server down
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.504765 [INFO] agent: shutdown complete
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.504821 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.504949 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.505094 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.505622 [INFO] agent: Waiting for endpoints to shut down
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.505742 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestOperatorRaftListPeersCommand - 2019/12/30 19:11:09.505915 [INFO] agent: Endpoints down
--- PASS: TestOperatorRaftListPeersCommand (2.09s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft/listpeers	2.356s
=== RUN   TestOperatorRaftRemovePeerCommand_noTabs
=== PAUSE TestOperatorRaftRemovePeerCommand_noTabs
=== RUN   TestOperatorRaftRemovePeerCommand
=== PAUSE TestOperatorRaftRemovePeerCommand
=== CONT  TestOperatorRaftRemovePeerCommand_noTabs
=== CONT  TestOperatorRaftRemovePeerCommand
--- PASS: TestOperatorRaftRemovePeerCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:14.018311 [WARN] agent: Node name "Node f4c35b0e-698f-f579-8631-c211e8a5cb1c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:14.019263 [DEBUG] tlsutil: Update with version 1
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:14.027768 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:11:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f4c35b0e-698f-f579-8631-c211e8a5cb1c Address:127.0.0.1:53506}]
2019/12/30 19:11:15 [INFO]  raft: Node at 127.0.0.1:53506 [Follower] entering Follower state (Leader: "")
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.210731 [INFO] serf: EventMemberJoin: Node f4c35b0e-698f-f579-8631-c211e8a5cb1c.dc1 127.0.0.1
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.215780 [INFO] serf: EventMemberJoin: Node f4c35b0e-698f-f579-8631-c211e8a5cb1c 127.0.0.1
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.217128 [INFO] consul: Adding LAN server Node f4c35b0e-698f-f579-8631-c211e8a5cb1c (Addr: tcp/127.0.0.1:53506) (DC: dc1)
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.217419 [INFO] consul: Handled member-join event for server "Node f4c35b0e-698f-f579-8631-c211e8a5cb1c.dc1" in area "wan"
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.218055 [INFO] agent: Started DNS server 127.0.0.1:53501 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.219031 [INFO] agent: Started DNS server 127.0.0.1:53501 (udp)
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.222433 [INFO] agent: Started HTTP server on 127.0.0.1:53502 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.222677 [INFO] agent: started state syncer
2019/12/30 19:11:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:11:15 [INFO]  raft: Node at 127.0.0.1:53506 [Candidate] entering Candidate state in term 2
2019/12/30 19:11:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:11:15 [INFO]  raft: Node at 127.0.0.1:53506 [Leader] entering Leader state
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.680434 [INFO] consul: cluster leadership acquired
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:15.680978 [INFO] consul: New leader elected: Node f4c35b0e-698f-f579-8631-c211e8a5cb1c
=== RUN   TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_directly
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.088501 [ERR] http: Request DELETE /v1/operator/raft/peer?address=nope, error: address "nope" was not found in the Raft configuration from=127.0.0.1:38648
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.091518 [INFO] agent: Synced node info
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.106050 [DEBUG] http: Request DELETE /v1/operator/raft/peer?address=nope (231.195884ms) from=127.0.0.1:38648
=== RUN   TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_with_-id
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.190392 [DEBUG] agent: Node info in sync
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.190508 [DEBUG] agent: Node info in sync
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.228266 [ERR] http: Request DELETE /v1/operator/raft/peer?id=nope, error: id "nope" was not found in the Raft configuration from=127.0.0.1:38650
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.229666 [DEBUG] http: Request DELETE /v1/operator/raft/peer?id=nope (112.293686ms) from=127.0.0.1:38650
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.232371 [INFO] agent: Requesting shutdown
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.232475 [INFO] consul: shutting down server
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.232556 [WARN] serf: Shutdown without a Leave
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:16.573924 [WARN] serf: Shutdown without a Leave
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.058456 [INFO] manager: shutting down
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.404979 [INFO] agent: consul server down
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.405054 [INFO] agent: shutdown complete
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.405127 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.405278 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (udp)
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.405439 [INFO] agent: Stopping HTTP server 127.0.0.1:53502 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.406108 [INFO] agent: Waiting for endpoints to shut down
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.406224 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.406360 [INFO] agent: Endpoints down
TestOperatorRaftRemovePeerCommand - 2019/12/30 19:11:17.406425 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
--- PASS: TestOperatorRaftRemovePeerCommand (3.47s)
    --- PASS: TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_directly (0.24s)
    --- PASS: TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_with_-id (0.12s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft/removepeer	3.824s
=== RUN   TestReloadCommand_noTabs
=== PAUSE TestReloadCommand_noTabs
=== RUN   TestReloadCommand
=== PAUSE TestReloadCommand
=== CONT  TestReloadCommand_noTabs
--- PASS: TestReloadCommand_noTabs (0.00s)
=== CONT  TestReloadCommand
WARNING: bootstrap = true: do not enable unless necessary
TestReloadCommand - 2019/12/30 19:11:24.951752 [WARN] agent: Node name "Node 901c64aa-6e15-27cd-a460-6cd38740929c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestReloadCommand - 2019/12/30 19:11:24.952667 [DEBUG] tlsutil: Update with version 1
TestReloadCommand - 2019/12/30 19:11:24.959301 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:11:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:901c64aa-6e15-27cd-a460-6cd38740929c Address:127.0.0.1:41506}]
2019/12/30 19:11:25 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
TestReloadCommand - 2019/12/30 19:11:25.667702 [INFO] serf: EventMemberJoin: Node 901c64aa-6e15-27cd-a460-6cd38740929c.dc1 127.0.0.1
TestReloadCommand - 2019/12/30 19:11:25.680102 [INFO] serf: EventMemberJoin: Node 901c64aa-6e15-27cd-a460-6cd38740929c 127.0.0.1
TestReloadCommand - 2019/12/30 19:11:25.681994 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestReloadCommand - 2019/12/30 19:11:25.683386 [INFO] consul: Adding LAN server Node 901c64aa-6e15-27cd-a460-6cd38740929c (Addr: tcp/127.0.0.1:41506) (DC: dc1)
TestReloadCommand - 2019/12/30 19:11:25.683818 [INFO] consul: Handled member-join event for server "Node 901c64aa-6e15-27cd-a460-6cd38740929c.dc1" in area "wan"
TestReloadCommand - 2019/12/30 19:11:25.684347 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestReloadCommand - 2019/12/30 19:11:25.688237 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestReloadCommand - 2019/12/30 19:11:25.688423 [INFO] agent: started state syncer
2019/12/30 19:11:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:11:25 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
2019/12/30 19:11:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:11:26 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
TestReloadCommand - 2019/12/30 19:11:26.130220 [INFO] consul: cluster leadership acquired
TestReloadCommand - 2019/12/30 19:11:26.130834 [INFO] consul: New leader elected: Node 901c64aa-6e15-27cd-a460-6cd38740929c
TestReloadCommand - 2019/12/30 19:11:26.223680 [DEBUG] http: Request PUT /v1/agent/reload (51.334µs) from=127.0.0.1:40074
TestReloadCommand - 2019/12/30 19:11:26.224240 [INFO] agent: Requesting shutdown
TestReloadCommand - 2019/12/30 19:11:26.224446 [INFO] consul: shutting down server
TestReloadCommand - 2019/12/30 19:11:26.224595 [WARN] serf: Shutdown without a Leave
TestReloadCommand - 2019/12/30 19:11:26.329819 [WARN] serf: Shutdown without a Leave
TestReloadCommand - 2019/12/30 19:11:26.446507 [INFO] manager: shutting down
TestReloadCommand - 2019/12/30 19:11:26.488133 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestReloadCommand - 2019/12/30 19:11:26.488214 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestReloadCommand - 2019/12/30 19:11:26.488327 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestReloadCommand - 2019/12/30 19:11:26.488422 [INFO] agent: consul server down
TestReloadCommand - 2019/12/30 19:11:26.488471 [INFO] agent: shutdown complete
TestReloadCommand - 2019/12/30 19:11:26.488526 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestReloadCommand - 2019/12/30 19:11:26.488688 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestReloadCommand - 2019/12/30 19:11:26.488851 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestReloadCommand - 2019/12/30 19:11:26.489378 [INFO] agent: Waiting for endpoints to shut down
TestReloadCommand - 2019/12/30 19:11:26.489523 [INFO] agent: Endpoints down
--- PASS: TestReloadCommand (1.61s)
PASS
ok  	github.com/hashicorp/consul/command/reload	1.896s
=== RUN   TestRTTCommand_noTabs
=== PAUSE TestRTTCommand_noTabs
=== RUN   TestRTTCommand_BadArgs
=== PAUSE TestRTTCommand_BadArgs
=== RUN   TestRTTCommand_LAN
=== PAUSE TestRTTCommand_LAN
=== RUN   TestRTTCommand_WAN
=== PAUSE TestRTTCommand_WAN
=== CONT  TestRTTCommand_noTabs
=== CONT  TestRTTCommand_WAN
=== CONT  TestRTTCommand_LAN
=== CONT  TestRTTCommand_BadArgs
=== RUN   TestRTTCommand_BadArgs/#00
=== RUN   TestRTTCommand_BadArgs/node1_node2_node3
=== RUN   TestRTTCommand_BadArgs/-wan_node1_node2
=== RUN   TestRTTCommand_BadArgs/-wan_node1.dc1_node2
=== RUN   TestRTTCommand_BadArgs/-wan_node1_node2.dc1
--- PASS: TestRTTCommand_noTabs (0.02s)
--- PASS: TestRTTCommand_BadArgs (0.02s)
    --- PASS: TestRTTCommand_BadArgs/#00 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/node1_node2_node3 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1_node2 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1.dc1_node2 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1_node2.dc1 (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestRTTCommand_WAN - 2019/12/30 19:11:31.584519 [WARN] agent: Node name "Node 5e1fbf12-c10d-dc91-03c4-80f599459a79" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRTTCommand_WAN - 2019/12/30 19:11:31.585499 [DEBUG] tlsutil: Update with version 1
TestRTTCommand_WAN - 2019/12/30 19:11:31.592533 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestRTTCommand_LAN - 2019/12/30 19:11:31.622757 [WARN] agent: Node name "Node 1dd4207a-bbf6-c838-7568-0717d84d4f4a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRTTCommand_LAN - 2019/12/30 19:11:31.623178 [DEBUG] tlsutil: Update with version 1
TestRTTCommand_LAN - 2019/12/30 19:11:31.625520 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:11:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5e1fbf12-c10d-dc91-03c4-80f599459a79 Address:127.0.0.1:10006}]
TestRTTCommand_WAN - 2019/12/30 19:11:32.843840 [INFO] serf: EventMemberJoin: Node 5e1fbf12-c10d-dc91-03c4-80f599459a79.dc1 127.0.0.1
2019/12/30 19:11:32 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestRTTCommand_WAN - 2019/12/30 19:11:32.855168 [INFO] serf: EventMemberJoin: Node 5e1fbf12-c10d-dc91-03c4-80f599459a79 127.0.0.1
TestRTTCommand_WAN - 2019/12/30 19:11:32.857202 [INFO] consul: Adding LAN server Node 5e1fbf12-c10d-dc91-03c4-80f599459a79 (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestRTTCommand_WAN - 2019/12/30 19:11:32.857635 [INFO] consul: Handled member-join event for server "Node 5e1fbf12-c10d-dc91-03c4-80f599459a79.dc1" in area "wan"
TestRTTCommand_WAN - 2019/12/30 19:11:32.858561 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestRTTCommand_WAN - 2019/12/30 19:11:32.859100 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestRTTCommand_WAN - 2019/12/30 19:11:32.862608 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestRTTCommand_WAN - 2019/12/30 19:11:32.862819 [INFO] agent: started state syncer
2019/12/30 19:11:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:11:32 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/12/30 19:11:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1dd4207a-bbf6-c838-7568-0717d84d4f4a Address:127.0.0.1:10012}]
2019/12/30 19:11:33 [INFO]  raft: Node at 127.0.0.1:10012 [Follower] entering Follower state (Leader: "")
TestRTTCommand_LAN - 2019/12/30 19:11:33.421271 [INFO] serf: EventMemberJoin: Node 1dd4207a-bbf6-c838-7568-0717d84d4f4a.dc1 127.0.0.1
TestRTTCommand_LAN - 2019/12/30 19:11:33.431959 [INFO] serf: EventMemberJoin: Node 1dd4207a-bbf6-c838-7568-0717d84d4f4a 127.0.0.1
TestRTTCommand_LAN - 2019/12/30 19:11:33.433508 [INFO] consul: Adding LAN server Node 1dd4207a-bbf6-c838-7568-0717d84d4f4a (Addr: tcp/127.0.0.1:10012) (DC: dc1)
TestRTTCommand_LAN - 2019/12/30 19:11:33.434112 [INFO] consul: Handled member-join event for server "Node 1dd4207a-bbf6-c838-7568-0717d84d4f4a.dc1" in area "wan"
TestRTTCommand_LAN - 2019/12/30 19:11:33.436315 [INFO] agent: Started DNS server 127.0.0.1:10007 (tcp)
TestRTTCommand_LAN - 2019/12/30 19:11:33.436725 [INFO] agent: Started DNS server 127.0.0.1:10007 (udp)
TestRTTCommand_LAN - 2019/12/30 19:11:33.439248 [INFO] agent: Started HTTP server on 127.0.0.1:10008 (tcp)
TestRTTCommand_LAN - 2019/12/30 19:11:33.439361 [INFO] agent: started state syncer
2019/12/30 19:11:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:11:33 [INFO]  raft: Node at 127.0.0.1:10012 [Candidate] entering Candidate state in term 2
2019/12/30 19:11:33 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:11:33 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestRTTCommand_WAN - 2019/12/30 19:11:33.908460 [INFO] consul: cluster leadership acquired
TestRTTCommand_WAN - 2019/12/30 19:11:33.909066 [INFO] consul: New leader elected: Node 5e1fbf12-c10d-dc91-03c4-80f599459a79
2019/12/30 19:11:34 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:11:34 [INFO]  raft: Node at 127.0.0.1:10012 [Leader] entering Leader state
TestRTTCommand_LAN - 2019/12/30 19:11:34.121884 [INFO] consul: cluster leadership acquired
TestRTTCommand_LAN - 2019/12/30 19:11:34.122301 [INFO] consul: New leader elected: Node 1dd4207a-bbf6-c838-7568-0717d84d4f4a
TestRTTCommand_LAN - 2019/12/30 19:11:34.464234 [INFO] agent: Synced node info
TestRTTCommand_LAN - 2019/12/30 19:11:34.464347 [DEBUG] agent: Node info in sync
TestRTTCommand_WAN - 2019/12/30 19:11:34.466538 [INFO] agent: Synced node info
TestRTTCommand_WAN - 2019/12/30 19:11:34.500802 [DEBUG] http: Request GET /v1/coordinate/datacenters (4.495788ms) from=127.0.0.1:52684
TestRTTCommand_WAN - 2019/12/30 19:11:34.644128 [DEBUG] http: Request GET /v1/agent/self (129.295141ms) from=127.0.0.1:52686
TestRTTCommand_WAN - 2019/12/30 19:11:34.663182 [DEBUG] http: Request GET /v1/coordinate/datacenters (949.693µs) from=127.0.0.1:52686
TestRTTCommand_WAN - 2019/12/30 19:11:34.716725 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.329702ms) from=127.0.0.1:52688
TestRTTCommand_WAN - 2019/12/30 19:11:34.718530 [INFO] agent: Requesting shutdown
TestRTTCommand_WAN - 2019/12/30 19:11:34.718623 [INFO] consul: shutting down server
TestRTTCommand_WAN - 2019/12/30 19:11:34.718677 [WARN] serf: Shutdown without a Leave
TestRTTCommand_WAN - 2019/12/30 19:11:34.779879 [WARN] serf: Shutdown without a Leave
TestRTTCommand_LAN - 2019/12/30 19:11:34.787170 [DEBUG] http: Request GET /v1/coordinate/nodes (1.179698ms) from=127.0.0.1:47756
TestRTTCommand_LAN - 2019/12/30 19:11:34.817991 [DEBUG] http: Request GET /v1/coordinate/nodes (865.357µs) from=127.0.0.1:47758
TestRTTCommand_WAN - 2019/12/30 19:11:34.847037 [INFO] manager: shutting down
TestRTTCommand_WAN - 2019/12/30 19:11:34.847432 [INFO] agent: consul server down
TestRTTCommand_WAN - 2019/12/30 19:11:34.847483 [INFO] agent: shutdown complete
TestRTTCommand_WAN - 2019/12/30 19:11:34.847535 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestRTTCommand_WAN - 2019/12/30 19:11:34.847636 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestRTTCommand_WAN - 2019/12/30 19:11:34.847670 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestRTTCommand_WAN - 2019/12/30 19:11:34.847831 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestRTTCommand_WAN - 2019/12/30 19:11:34.848637 [INFO] agent: Waiting for endpoints to shut down
TestRTTCommand_WAN - 2019/12/30 19:11:34.848800 [INFO] agent: Endpoints down
--- PASS: TestRTTCommand_WAN (3.42s)
TestRTTCommand_LAN - 2019/12/30 19:11:34.859662 [DEBUG] http: Request GET /v1/coordinate/nodes (9.07591ms) from=127.0.0.1:47760
TestRTTCommand_LAN - 2019/12/30 19:11:34.895400 [DEBUG] http: Request GET /v1/coordinate/nodes (1.433705ms) from=127.0.0.1:47762
TestRTTCommand_LAN - 2019/12/30 19:11:34.925801 [DEBUG] http: Request GET /v1/coordinate/nodes (953.692µs) from=127.0.0.1:47764
TestRTTCommand_LAN - 2019/12/30 19:11:34.955912 [DEBUG] http: Request GET /v1/coordinate/nodes (901.358µs) from=127.0.0.1:47766
TestRTTCommand_LAN - 2019/12/30 19:11:34.986557 [DEBUG] http: Request GET /v1/coordinate/nodes (1.326035ms) from=127.0.0.1:47768
TestRTTCommand_LAN - 2019/12/30 19:11:35.016613 [DEBUG] http: Request GET /v1/coordinate/nodes (845.356µs) from=127.0.0.1:47770
TestRTTCommand_LAN - 2019/12/30 19:11:35.046805 [DEBUG] http: Request GET /v1/coordinate/nodes (893.691µs) from=127.0.0.1:47772
TestRTTCommand_LAN - 2019/12/30 19:11:35.077007 [DEBUG] http: Request GET /v1/coordinate/nodes (912.691µs) from=127.0.0.1:47774
TestRTTCommand_LAN - 2019/12/30 19:11:35.106950 [DEBUG] http: Request GET /v1/coordinate/nodes (904.358µs) from=127.0.0.1:47776
TestRTTCommand_LAN - 2019/12/30 19:11:35.137207 [DEBUG] http: Request GET /v1/coordinate/nodes (984.36µs) from=127.0.0.1:47778
TestRTTCommand_LAN - 2019/12/30 19:11:35.167205 [DEBUG] http: Request GET /v1/coordinate/nodes (972.359µs) from=127.0.0.1:47780
TestRTTCommand_LAN - 2019/12/30 19:11:35.197216 [DEBUG] http: Request GET /v1/coordinate/nodes (895.024µs) from=127.0.0.1:47782
TestRTTCommand_LAN - 2019/12/30 19:11:35.227370 [DEBUG] http: Request GET /v1/coordinate/nodes (906.025µs) from=127.0.0.1:47784
TestRTTCommand_LAN - 2019/12/30 19:11:35.257812 [DEBUG] http: Request GET /v1/coordinate/nodes (1.039694ms) from=127.0.0.1:47786
TestRTTCommand_LAN - 2019/12/30 19:11:35.287853 [DEBUG] http: Request GET /v1/coordinate/nodes (985.026µs) from=127.0.0.1:47788
TestRTTCommand_LAN - 2019/12/30 19:11:35.418552 [DEBUG] http: Request GET /v1/agent/self (125.059027ms) from=127.0.0.1:47790
TestRTTCommand_LAN - 2019/12/30 19:11:35.434265 [DEBUG] http: Request GET /v1/coordinate/nodes (872.357µs) from=127.0.0.1:47790
TestRTTCommand_LAN - 2019/12/30 19:11:35.440277 [DEBUG] http: Request GET /v1/coordinate/nodes (1.11303ms) from=127.0.0.1:47792
TestRTTCommand_LAN - 2019/12/30 19:11:35.442058 [INFO] agent: Requesting shutdown
TestRTTCommand_LAN - 2019/12/30 19:11:35.442151 [INFO] consul: shutting down server
TestRTTCommand_LAN - 2019/12/30 19:11:35.442197 [WARN] serf: Shutdown without a Leave
TestRTTCommand_LAN - 2019/12/30 19:11:35.538938 [WARN] serf: Shutdown without a Leave
TestRTTCommand_LAN - 2019/12/30 19:11:35.613596 [INFO] manager: shutting down
TestRTTCommand_LAN - 2019/12/30 19:11:35.614205 [INFO] agent: consul server down
TestRTTCommand_LAN - 2019/12/30 19:11:35.614258 [INFO] agent: shutdown complete
TestRTTCommand_LAN - 2019/12/30 19:11:35.614311 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (tcp)
TestRTTCommand_LAN - 2019/12/30 19:11:35.614514 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (udp)
TestRTTCommand_LAN - 2019/12/30 19:11:35.614661 [INFO] agent: Stopping HTTP server 127.0.0.1:10008 (tcp)
TestRTTCommand_LAN - 2019/12/30 19:11:35.617696 [INFO] agent: Waiting for endpoints to shut down
TestRTTCommand_LAN - 2019/12/30 19:11:35.618007 [ERR] connect: Apply failed leadership lost while committing log
TestRTTCommand_LAN - 2019/12/30 19:11:35.618051 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestRTTCommand_LAN - 2019/12/30 19:11:35.618219 [INFO] agent: Endpoints down
--- PASS: TestRTTCommand_LAN (4.19s)
PASS
ok  	github.com/hashicorp/consul/command/rtt	4.547s
=== RUN   TestDevModeHasNoServices
=== PAUSE TestDevModeHasNoServices
=== RUN   TestStructsToAgentService
=== PAUSE TestStructsToAgentService
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== CONT  TestDevModeHasNoServices
=== CONT  TestCommand_noTabs
=== CONT  TestStructsToAgentService
=== RUN   TestStructsToAgentService/Basic_service_with_port
=== PAUSE TestStructsToAgentService/Basic_service_with_port
=== RUN   TestStructsToAgentService/Service_with_a_check
=== PAUSE TestStructsToAgentService/Service_with_a_check
=== RUN   TestStructsToAgentService/Service_with_checks
=== PAUSE TestStructsToAgentService/Service_with_checks
=== RUN   TestStructsToAgentService/Proxy_service
=== PAUSE TestStructsToAgentService/Proxy_service
=== CONT  TestStructsToAgentService/Basic_service_with_port
=== CONT  TestStructsToAgentService/Proxy_service
=== CONT  TestStructsToAgentService/Service_with_checks
=== CONT  TestStructsToAgentService/Service_with_a_check
--- PASS: TestCommand_noTabs (0.01s)
--- PASS: TestStructsToAgentService (0.00s)
    --- PASS: TestStructsToAgentService/Basic_service_with_port (0.00s)
    --- PASS: TestStructsToAgentService/Proxy_service (0.00s)
    --- PASS: TestStructsToAgentService/Service_with_a_check (0.00s)
    --- PASS: TestStructsToAgentService/Service_with_checks (0.01s)
--- PASS: TestDevModeHasNoServices (0.07s)
PASS
ok  	github.com/hashicorp/consul/command/services	0.230s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_File_id
=== PAUSE TestCommand_File_id
=== RUN   TestCommand_File_nameOnly
=== PAUSE TestCommand_File_nameOnly
=== RUN   TestCommand_Flag
=== PAUSE TestCommand_Flag
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_File_nameOnly
=== CONT  TestCommand_Flag
=== CONT  TestCommand_File_id
=== CONT  TestCommand_Validation
--- PASS: TestCommand_noTabs (0.01s)
=== RUN   TestCommand_Validation/no_args_or_id
=== RUN   TestCommand_Validation/args_and_-id
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/no_args_or_id (0.00s)
    --- PASS: TestCommand_Validation/args_and_-id (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File_id - 2019/12/30 19:12:06.232913 [WARN] agent: Node name "Node d54e9ec5-788c-a8b7-a058-ac8cc754021e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File_id - 2019/12/30 19:12:06.242823 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File_nameOnly - 2019/12/30 19:12:06.243997 [WARN] agent: Node name "Node 89dbd23e-1741-7912-7a8b-937c53e955bf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File_nameOnly - 2019/12/30 19:12:06.244546 [DEBUG] tlsutil: Update with version 1
TestCommand_File_nameOnly - 2019/12/30 19:12:06.250244 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File_id - 2019/12/30 19:12:06.260064 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_Flag - 2019/12/30 19:12:06.276912 [WARN] agent: Node name "Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_Flag - 2019/12/30 19:12:06.277420 [DEBUG] tlsutil: Update with version 1
TestCommand_Flag - 2019/12/30 19:12:06.280931 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:12:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:89dbd23e-1741-7912-7a8b-937c53e955bf Address:127.0.0.1:52012}]
2019/12/30 19:12:07 [INFO]  raft: Node at 127.0.0.1:52012 [Follower] entering Follower state (Leader: "")
TestCommand_File_nameOnly - 2019/12/30 19:12:07.260870 [INFO] serf: EventMemberJoin: Node 89dbd23e-1741-7912-7a8b-937c53e955bf.dc1 127.0.0.1
TestCommand_File_nameOnly - 2019/12/30 19:12:07.264680 [INFO] serf: EventMemberJoin: Node 89dbd23e-1741-7912-7a8b-937c53e955bf 127.0.0.1
TestCommand_File_nameOnly - 2019/12/30 19:12:07.265609 [INFO] consul: Adding LAN server Node 89dbd23e-1741-7912-7a8b-937c53e955bf (Addr: tcp/127.0.0.1:52012) (DC: dc1)
TestCommand_File_nameOnly - 2019/12/30 19:12:07.265962 [INFO] consul: Handled member-join event for server "Node 89dbd23e-1741-7912-7a8b-937c53e955bf.dc1" in area "wan"
TestCommand_File_nameOnly - 2019/12/30 19:12:07.266506 [INFO] agent: Started DNS server 127.0.0.1:52007 (tcp)
TestCommand_File_nameOnly - 2019/12/30 19:12:07.266878 [INFO] agent: Started DNS server 127.0.0.1:52007 (udp)
TestCommand_File_nameOnly - 2019/12/30 19:12:07.269625 [INFO] agent: Started HTTP server on 127.0.0.1:52008 (tcp)
TestCommand_File_nameOnly - 2019/12/30 19:12:07.269762 [INFO] agent: started state syncer
2019/12/30 19:12:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:07 [INFO]  raft: Node at 127.0.0.1:52012 [Candidate] entering Candidate state in term 2
2019/12/30 19:12:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d54e9ec5-788c-a8b7-a058-ac8cc754021e Address:127.0.0.1:52018}]
2019/12/30 19:12:07 [INFO]  raft: Node at 127.0.0.1:52018 [Follower] entering Follower state (Leader: "")
TestCommand_File_id - 2019/12/30 19:12:07.360923 [INFO] serf: EventMemberJoin: Node d54e9ec5-788c-a8b7-a058-ac8cc754021e.dc1 127.0.0.1
TestCommand_File_id - 2019/12/30 19:12:07.364537 [INFO] serf: EventMemberJoin: Node d54e9ec5-788c-a8b7-a058-ac8cc754021e 127.0.0.1
TestCommand_File_id - 2019/12/30 19:12:07.365530 [INFO] consul: Handled member-join event for server "Node d54e9ec5-788c-a8b7-a058-ac8cc754021e.dc1" in area "wan"
TestCommand_File_id - 2019/12/30 19:12:07.365636 [INFO] consul: Adding LAN server Node d54e9ec5-788c-a8b7-a058-ac8cc754021e (Addr: tcp/127.0.0.1:52018) (DC: dc1)
TestCommand_File_id - 2019/12/30 19:12:07.366948 [INFO] agent: Started DNS server 127.0.0.1:52013 (tcp)
TestCommand_File_id - 2019/12/30 19:12:07.367622 [INFO] agent: Started DNS server 127.0.0.1:52013 (udp)
TestCommand_File_id - 2019/12/30 19:12:07.370325 [INFO] agent: Started HTTP server on 127.0.0.1:52014 (tcp)
TestCommand_File_id - 2019/12/30 19:12:07.370443 [INFO] agent: started state syncer
2019/12/30 19:12:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:07 [INFO]  raft: Node at 127.0.0.1:52018 [Candidate] entering Candidate state in term 2
2019/12/30 19:12:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6e09d437-bd3c-391f-a42d-e6bbc35616d1 Address:127.0.0.1:52006}]
2019/12/30 19:12:07 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestCommand_Flag - 2019/12/30 19:12:07.510100 [INFO] serf: EventMemberJoin: Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1.dc1 127.0.0.1
TestCommand_Flag - 2019/12/30 19:12:07.557415 [INFO] serf: EventMemberJoin: Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1 127.0.0.1
TestCommand_Flag - 2019/12/30 19:12:07.558414 [INFO] consul: Adding LAN server Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestCommand_Flag - 2019/12/30 19:12:07.558750 [INFO] consul: Handled member-join event for server "Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1.dc1" in area "wan"
TestCommand_Flag - 2019/12/30 19:12:07.558837 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestCommand_Flag - 2019/12/30 19:12:07.559150 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
2019/12/30 19:12:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:07 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
TestCommand_Flag - 2019/12/30 19:12:07.566181 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestCommand_Flag - 2019/12/30 19:12:07.566339 [INFO] agent: started state syncer
2019/12/30 19:12:08 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:08 [INFO]  raft: Node at 127.0.0.1:52012 [Leader] entering Leader state
TestCommand_File_nameOnly - 2019/12/30 19:12:08.389464 [INFO] consul: cluster leadership acquired
TestCommand_File_nameOnly - 2019/12/30 19:12:08.390213 [INFO] consul: New leader elected: Node 89dbd23e-1741-7912-7a8b-937c53e955bf
2019/12/30 19:12:08 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:08 [INFO]  raft: Node at 127.0.0.1:52018 [Leader] entering Leader state
TestCommand_File_id - 2019/12/30 19:12:08.474376 [INFO] consul: cluster leadership acquired
TestCommand_File_id - 2019/12/30 19:12:08.474881 [INFO] consul: New leader elected: Node d54e9ec5-788c-a8b7-a058-ac8cc754021e
2019/12/30 19:12:08 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:08 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestCommand_Flag - 2019/12/30 19:12:08.564935 [INFO] consul: cluster leadership acquired
TestCommand_Flag - 2019/12/30 19:12:08.565521 [INFO] consul: New leader elected: Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1
TestCommand_File_nameOnly - 2019/12/30 19:12:08.806747 [INFO] agent: Synced node info
TestCommand_Flag - 2019/12/30 19:12:08.898090 [INFO] agent: Synced node info
TestCommand_File_id - 2019/12/30 19:12:08.957446 [INFO] agent: Synced service "web"
TestCommand_File_id - 2019/12/30 19:12:08.957531 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/30 19:12:08.957628 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/12/30 19:12:08.957666 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/30 19:12:08.957791 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/12/30 19:12:08.957833 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/30 19:12:08.957917 [DEBUG] http: Request PUT /v1/agent/service/register (424.021045ms) from=127.0.0.1:50824
TestCommand_File_id - 2019/12/30 19:12:09.033625 [DEBUG] agent: Service "web" in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:09.206899 [INFO] agent: Synced service "web"
TestCommand_File_nameOnly - 2019/12/30 19:12:09.207008 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:09.207087 [DEBUG] http: Request PUT /v1/agent/service/register (681.955966ms) from=127.0.0.1:45828
TestCommand_Flag - 2019/12/30 19:12:09.206899 [INFO] agent: Synced service "web"
TestCommand_Flag - 2019/12/30 19:12:09.207645 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/30 19:12:09.207723 [DEBUG] http: Request PUT /v1/agent/service/register (452.797483ms) from=127.0.0.1:36134
TestCommand_File_nameOnly - 2019/12/30 19:12:09.283620 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/12/30 19:12:09.399368 [INFO] agent: Synced service "db"
TestCommand_File_id - 2019/12/30 19:12:09.399491 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/30 19:12:09.399561 [DEBUG] http: Request PUT /v1/agent/service/register (439.948472ms) from=127.0.0.1:50824
TestCommand_File_id - 2019/12/30 19:12:09.535576 [DEBUG] agent: removed service "web"
TestCommand_Flag - 2019/12/30 19:12:09.907208 [INFO] agent: Synced service "db"
TestCommand_Flag - 2019/12/30 19:12:09.907291 [DEBUG] agent: Service "web" in sync
TestCommand_Flag - 2019/12/30 19:12:09.907329 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/30 19:12:09.907407 [DEBUG] http: Request PUT /v1/agent/service/register (698.119733ms) from=127.0.0.1:36134
TestCommand_Flag - 2019/12/30 19:12:09.907938 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/12/30 19:12:09.923474 [INFO] agent: Deregistered service "web"
TestCommand_File_id - 2019/12/30 19:12:09.923555 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/12/30 19:12:09.923589 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/30 19:12:09.923657 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (388.753098ms) from=127.0.0.1:50828
TestCommand_File_nameOnly - 2019/12/30 19:12:09.923786 [INFO] agent: Synced service "db"
TestCommand_File_nameOnly - 2019/12/30 19:12:09.923832 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:09.923898 [DEBUG] http: Request PUT /v1/agent/service/register (714.619509ms) from=127.0.0.1:45828
TestCommand_File_nameOnly - 2019/12/30 19:12:09.924295 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/12/30 19:12:09.932962 [DEBUG] http: Request GET /v1/agent/services (7.635205ms) from=127.0.0.1:50824
TestCommand_File_id - 2019/12/30 19:12:09.952218 [INFO] agent: Requesting shutdown
TestCommand_File_id - 2019/12/30 19:12:09.955250 [INFO] consul: shutting down server
TestCommand_File_id - 2019/12/30 19:12:09.955448 [WARN] serf: Shutdown without a Leave
TestCommand_File_id - 2019/12/30 19:12:10.049222 [WARN] serf: Shutdown without a Leave
TestCommand_Flag - 2019/12/30 19:12:10.050293 [INFO] agent: Synced service "db"
TestCommand_Flag - 2019/12/30 19:12:10.050362 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/30 19:12:10.050480 [DEBUG] agent: Service "web" in sync
TestCommand_Flag - 2019/12/30 19:12:10.050523 [DEBUG] agent: Service "db" in sync
TestCommand_Flag - 2019/12/30 19:12:10.050554 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/30 19:12:10.051315 [DEBUG] agent: removed service "web"
TestCommand_File_id - 2019/12/30 19:12:10.161449 [INFO] manager: shutting down
TestCommand_File_nameOnly - 2019/12/30 19:12:10.439913 [INFO] agent: Synced service "db"
TestCommand_File_nameOnly - 2019/12/30 19:12:10.439995 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.440137 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.440185 [DEBUG] agent: Service "web" in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.440216 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.440382 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.441256 [DEBUG] agent: removed service "web"
TestCommand_File_id - 2019/12/30 19:12:10.442567 [INFO] agent: consul server down
TestCommand_File_id - 2019/12/30 19:12:10.442643 [INFO] agent: shutdown complete
TestCommand_File_id - 2019/12/30 19:12:10.442776 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (tcp)
TestCommand_File_id - 2019/12/30 19:12:10.442918 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (udp)
TestCommand_File_id - 2019/12/30 19:12:10.443064 [INFO] agent: Stopping HTTP server 127.0.0.1:52014 (tcp)
TestCommand_File_id - 2019/12/30 19:12:10.443681 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File_id - 2019/12/30 19:12:10.443781 [ERR] connect: Apply failed leadership lost while committing log
TestCommand_File_id - 2019/12/30 19:12:10.443820 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_File_id - 2019/12/30 19:12:10.444000 [INFO] agent: Endpoints down
--- PASS: TestCommand_File_id (4.38s)
TestCommand_Flag - 2019/12/30 19:12:10.673540 [INFO] agent: Deregistered service "web"
TestCommand_Flag - 2019/12/30 19:12:10.673632 [DEBUG] agent: Service "db" in sync
TestCommand_Flag - 2019/12/30 19:12:10.673670 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/30 19:12:10.673775 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (750.036458ms) from=127.0.0.1:36138
TestCommand_Flag - 2019/12/30 19:12:10.674053 [DEBUG] agent: Service "db" in sync
TestCommand_Flag - 2019/12/30 19:12:10.674121 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/30 19:12:10.674093 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommand_Flag - 2019/12/30 19:12:10.674719 [DEBUG] consul: Skipping self join check for "Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1" since the cluster is too small
TestCommand_Flag - 2019/12/30 19:12:10.674897 [INFO] consul: member 'Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1' joined, marking health alive
TestCommand_Flag - 2019/12/30 19:12:10.676630 [DEBUG] http: Request GET /v1/agent/services (1.203365ms) from=127.0.0.1:36134
TestCommand_Flag - 2019/12/30 19:12:10.678618 [INFO] agent: Requesting shutdown
TestCommand_Flag - 2019/12/30 19:12:10.678721 [INFO] consul: shutting down server
TestCommand_Flag - 2019/12/30 19:12:10.678769 [WARN] serf: Shutdown without a Leave
TestCommand_File_nameOnly - 2019/12/30 19:12:10.740108 [INFO] agent: Deregistered service "web"
TestCommand_File_nameOnly - 2019/12/30 19:12:10.740195 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.740326 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.740372 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/30 19:12:10.740434 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (700.147787ms) from=127.0.0.1:45838
TestCommand_File_nameOnly - 2019/12/30 19:12:10.743165 [DEBUG] http: Request GET /v1/agent/services (804.022µs) from=127.0.0.1:45828
TestCommand_File_nameOnly - 2019/12/30 19:12:10.746094 [INFO] agent: Requesting shutdown
TestCommand_File_nameOnly - 2019/12/30 19:12:10.746193 [INFO] consul: shutting down server
TestCommand_File_nameOnly - 2019/12/30 19:12:10.746237 [WARN] serf: Shutdown without a Leave
TestCommand_Flag - 2019/12/30 19:12:10.830652 [WARN] serf: Shutdown without a Leave
TestCommand_File_nameOnly - 2019/12/30 19:12:10.831326 [WARN] serf: Shutdown without a Leave
TestCommand_File_nameOnly - 2019/12/30 19:12:10.907245 [INFO] manager: shutting down
TestCommand_Flag - 2019/12/30 19:12:10.907764 [INFO] agent: consul server down
TestCommand_Flag - 2019/12/30 19:12:10.907812 [INFO] agent: shutdown complete
TestCommand_Flag - 2019/12/30 19:12:10.907857 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestCommand_Flag - 2019/12/30 19:12:10.907972 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestCommand_Flag - 2019/12/30 19:12:10.908099 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestCommand_Flag - 2019/12/30 19:12:10.908639 [INFO] agent: Waiting for endpoints to shut down
TestCommand_Flag - 2019/12/30 19:12:10.908944 [ERR] consul: failed to reconcile member: {Node 6e09d437-bd3c-391f-a42d-e6bbc35616d1 127.0.0.1 52004 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:6e09d437-bd3c-391f-a42d-e6bbc35616d1 port:52006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:52005] alive 1 5 2 2 5 4}: leadership lost while committing log
TestCommand_Flag - 2019/12/30 19:12:10.908959 [INFO] agent: Endpoints down
--- PASS: TestCommand_Flag (4.85s)
TestCommand_Flag - 2019/12/30 19:12:10.909298 [INFO] manager: shutting down
TestCommand_File_nameOnly - 2019/12/30 19:12:10.909792 [ERR] connect: Apply failed leadership lost while committing log
TestCommand_File_nameOnly - 2019/12/30 19:12:10.909852 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_File_nameOnly - 2019/12/30 19:12:10.910375 [INFO] agent: consul server down
TestCommand_File_nameOnly - 2019/12/30 19:12:10.910732 [INFO] agent: shutdown complete
TestCommand_File_nameOnly - 2019/12/30 19:12:10.911037 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (tcp)
TestCommand_File_nameOnly - 2019/12/30 19:12:10.911401 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (udp)
TestCommand_File_nameOnly - 2019/12/30 19:12:10.911739 [INFO] agent: Stopping HTTP server 127.0.0.1:52008 (tcp)
TestCommand_File_nameOnly - 2019/12/30 19:12:10.912464 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File_nameOnly - 2019/12/30 19:12:10.912789 [INFO] agent: Endpoints down
--- PASS: TestCommand_File_nameOnly (4.85s)
PASS
ok  	github.com/hashicorp/consul/command/services/deregister	5.150s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_File
=== PAUSE TestCommand_File
=== RUN   TestCommand_Flags
=== PAUSE TestCommand_Flags
=== CONT  TestCommand_Flags
=== CONT  TestCommand_Validation
=== CONT  TestCommand_noTabs
=== RUN   TestCommand_Validation/no_args_or_id
--- PASS: TestCommand_noTabs (0.00s)
=== CONT  TestCommand_File
=== RUN   TestCommand_Validation/args_and_-name
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/no_args_or_id (0.00s)
    --- PASS: TestCommand_Validation/args_and_-name (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_Flags - 2019/12/30 19:12:13.185033 [WARN] agent: Node name "Node 800c1baf-85a1-8738-76ad-a8c0bb8fe18e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_Flags - 2019/12/30 19:12:13.186301 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File - 2019/12/30 19:12:13.193071 [WARN] agent: Node name "Node 9b1d7fbe-2b90-fc2d-83a3-9942ec3167af" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File - 2019/12/30 19:12:13.193747 [DEBUG] tlsutil: Update with version 1
TestCommand_File - 2019/12/30 19:12:13.199633 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_Flags - 2019/12/30 19:12:13.205691 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:12:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9b1d7fbe-2b90-fc2d-83a3-9942ec3167af Address:127.0.0.1:38512}]
2019/12/30 19:12:14 [INFO]  raft: Node at 127.0.0.1:38512 [Follower] entering Follower state (Leader: "")
TestCommand_File - 2019/12/30 19:12:14.470084 [INFO] serf: EventMemberJoin: Node 9b1d7fbe-2b90-fc2d-83a3-9942ec3167af.dc1 127.0.0.1
TestCommand_File - 2019/12/30 19:12:14.473773 [INFO] serf: EventMemberJoin: Node 9b1d7fbe-2b90-fc2d-83a3-9942ec3167af 127.0.0.1
TestCommand_File - 2019/12/30 19:12:14.475305 [INFO] consul: Adding LAN server Node 9b1d7fbe-2b90-fc2d-83a3-9942ec3167af (Addr: tcp/127.0.0.1:38512) (DC: dc1)
TestCommand_File - 2019/12/30 19:12:14.475775 [INFO] consul: Handled member-join event for server "Node 9b1d7fbe-2b90-fc2d-83a3-9942ec3167af.dc1" in area "wan"
TestCommand_File - 2019/12/30 19:12:14.476110 [INFO] agent: Started DNS server 127.0.0.1:38507 (udp)
TestCommand_File - 2019/12/30 19:12:14.476344 [INFO] agent: Started DNS server 127.0.0.1:38507 (tcp)
TestCommand_File - 2019/12/30 19:12:14.479532 [INFO] agent: Started HTTP server on 127.0.0.1:38508 (tcp)
TestCommand_File - 2019/12/30 19:12:14.479650 [INFO] agent: started state syncer
2019/12/30 19:12:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:14 [INFO]  raft: Node at 127.0.0.1:38512 [Candidate] entering Candidate state in term 2
2019/12/30 19:12:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:800c1baf-85a1-8738-76ad-a8c0bb8fe18e Address:127.0.0.1:38506}]
2019/12/30 19:12:14 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
TestCommand_Flags - 2019/12/30 19:12:14.619305 [INFO] serf: EventMemberJoin: Node 800c1baf-85a1-8738-76ad-a8c0bb8fe18e.dc1 127.0.0.1
TestCommand_Flags - 2019/12/30 19:12:14.631522 [INFO] serf: EventMemberJoin: Node 800c1baf-85a1-8738-76ad-a8c0bb8fe18e 127.0.0.1
TestCommand_Flags - 2019/12/30 19:12:14.633292 [INFO] consul: Adding LAN server Node 800c1baf-85a1-8738-76ad-a8c0bb8fe18e (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestCommand_Flags - 2019/12/30 19:12:14.633415 [INFO] consul: Handled member-join event for server "Node 800c1baf-85a1-8738-76ad-a8c0bb8fe18e.dc1" in area "wan"
TestCommand_Flags - 2019/12/30 19:12:14.635357 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestCommand_Flags - 2019/12/30 19:12:14.635891 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestCommand_Flags - 2019/12/30 19:12:14.638830 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestCommand_Flags - 2019/12/30 19:12:14.639033 [INFO] agent: started state syncer
2019/12/30 19:12:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:14 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/12/30 19:12:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:15 [INFO]  raft: Node at 127.0.0.1:38512 [Leader] entering Leader state
TestCommand_File - 2019/12/30 19:12:15.014904 [INFO] consul: cluster leadership acquired
TestCommand_File - 2019/12/30 19:12:15.015461 [INFO] consul: New leader elected: Node 9b1d7fbe-2b90-fc2d-83a3-9942ec3167af
2019/12/30 19:12:15 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:15 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
TestCommand_Flags - 2019/12/30 19:12:15.299193 [INFO] consul: cluster leadership acquired
TestCommand_Flags - 2019/12/30 19:12:15.301779 [INFO] consul: New leader elected: Node 800c1baf-85a1-8738-76ad-a8c0bb8fe18e
TestCommand_File - 2019/12/30 19:12:15.365785 [INFO] agent: Synced node info
TestCommand_Flags - 2019/12/30 19:12:16.315566 [INFO] agent: Synced node info
TestCommand_File - 2019/12/30 19:12:16.623798 [INFO] agent: Synced service "web"
TestCommand_File - 2019/12/30 19:12:16.623927 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/12/30 19:12:16.624031 [DEBUG] http: Request PUT /v1/agent/service/register (1.366390321s) from=127.0.0.1:43508
TestCommand_File - 2019/12/30 19:12:16.632007 [DEBUG] http: Request GET /v1/agent/services (1.375371ms) from=127.0.0.1:43512
TestCommand_File - 2019/12/30 19:12:16.635691 [INFO] agent: Requesting shutdown
TestCommand_File - 2019/12/30 19:12:16.635808 [INFO] consul: shutting down server
TestCommand_File - 2019/12/30 19:12:16.635859 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/30 19:12:16.699056 [DEBUG] agent: Service "web" in sync
TestCommand_File - 2019/12/30 19:12:16.699147 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/12/30 19:12:16.699254 [DEBUG] agent: Service "web" in sync
TestCommand_File - 2019/12/30 19:12:16.699288 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/12/30 19:12:16.757490 [WARN] serf: Shutdown without a Leave
TestCommand_Flags - 2019/12/30 19:12:16.760487 [INFO] agent: Synced service "web"
TestCommand_Flags - 2019/12/30 19:12:16.760569 [DEBUG] agent: Node info in sync
TestCommand_Flags - 2019/12/30 19:12:16.760639 [DEBUG] http: Request PUT /v1/agent/service/register (1.138049529s) from=127.0.0.1:47952
TestCommand_Flags - 2019/12/30 19:12:16.768676 [DEBUG] http: Request GET /v1/agent/services (5.250474ms) from=127.0.0.1:47956
TestCommand_Flags - 2019/12/30 19:12:16.782335 [INFO] agent: Requesting shutdown
TestCommand_Flags - 2019/12/30 19:12:16.783621 [INFO] consul: shutting down server
TestCommand_Flags - 2019/12/30 19:12:16.783691 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/30 19:12:16.822591 [INFO] manager: shutting down
TestCommand_Flags - 2019/12/30 19:12:16.897886 [WARN] serf: Shutdown without a Leave
TestCommand_Flags - 2019/12/30 19:12:16.972621 [INFO] manager: shutting down
TestCommand_File - 2019/12/30 19:12:16.972824 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_File - 2019/12/30 19:12:16.972885 [INFO] agent: consul server down
TestCommand_File - 2019/12/30 19:12:16.972926 [INFO] agent: shutdown complete
TestCommand_File - 2019/12/30 19:12:16.972960 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCommand_File - 2019/12/30 19:12:16.972980 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (tcp)
TestCommand_File - 2019/12/30 19:12:16.973177 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (udp)
TestCommand_File - 2019/12/30 19:12:16.973358 [INFO] agent: Stopping HTTP server 127.0.0.1:38508 (tcp)
TestCommand_File - 2019/12/30 19:12:16.974164 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File - 2019/12/30 19:12:16.974226 [INFO] agent: Endpoints down
--- PASS: TestCommand_File (3.94s)
TestCommand_Flags - 2019/12/30 19:12:17.040913 [INFO] agent: consul server down
TestCommand_Flags - 2019/12/30 19:12:17.040991 [INFO] agent: shutdown complete
TestCommand_Flags - 2019/12/30 19:12:17.041047 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestCommand_Flags - 2019/12/30 19:12:17.041187 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestCommand_Flags - 2019/12/30 19:12:17.041373 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestCommand_Flags - 2019/12/30 19:12:17.042045 [INFO] agent: Waiting for endpoints to shut down
TestCommand_Flags - 2019/12/30 19:12:17.042169 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_Flags - 2019/12/30 19:12:17.042358 [INFO] agent: Endpoints down
--- PASS: TestCommand_Flags (4.01s)
PASS
ok  	github.com/hashicorp/consul/command/services/register	4.330s
=== RUN   TestSnapshotCommand_noTabs
=== PAUSE TestSnapshotCommand_noTabs
=== CONT  TestSnapshotCommand_noTabs
--- PASS: TestSnapshotCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot	0.044s
=== RUN   TestSnapshotInspectCommand_noTabs
=== PAUSE TestSnapshotInspectCommand_noTabs
=== RUN   TestSnapshotInspectCommand_Validation
=== PAUSE TestSnapshotInspectCommand_Validation
=== RUN   TestSnapshotInspectCommand
=== PAUSE TestSnapshotInspectCommand
=== CONT  TestSnapshotInspectCommand_noTabs
=== CONT  TestSnapshotInspectCommand
=== CONT  TestSnapshotInspectCommand_Validation
--- PASS: TestSnapshotInspectCommand_Validation (0.00s)
--- PASS: TestSnapshotInspectCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotInspectCommand - 2019/12/30 19:12:25.167401 [WARN] agent: Node name "Node 585c3774-a1d5-5bf0-5ec3-05274bae4635" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotInspectCommand - 2019/12/30 19:12:25.168221 [DEBUG] tlsutil: Update with version 1
TestSnapshotInspectCommand - 2019/12/30 19:12:25.175158 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:12:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:585c3774-a1d5-5bf0-5ec3-05274bae4635 Address:127.0.0.1:31006}]
2019/12/30 19:12:25 [INFO]  raft: Node at 127.0.0.1:31006 [Follower] entering Follower state (Leader: "")
TestSnapshotInspectCommand - 2019/12/30 19:12:25.928865 [INFO] serf: EventMemberJoin: Node 585c3774-a1d5-5bf0-5ec3-05274bae4635.dc1 127.0.0.1
TestSnapshotInspectCommand - 2019/12/30 19:12:25.934932 [INFO] serf: EventMemberJoin: Node 585c3774-a1d5-5bf0-5ec3-05274bae4635 127.0.0.1
TestSnapshotInspectCommand - 2019/12/30 19:12:25.936878 [INFO] consul: Adding LAN server Node 585c3774-a1d5-5bf0-5ec3-05274bae4635 (Addr: tcp/127.0.0.1:31006) (DC: dc1)
TestSnapshotInspectCommand - 2019/12/30 19:12:25.937837 [INFO] consul: Handled member-join event for server "Node 585c3774-a1d5-5bf0-5ec3-05274bae4635.dc1" in area "wan"
TestSnapshotInspectCommand - 2019/12/30 19:12:25.939894 [INFO] agent: Started DNS server 127.0.0.1:31001 (tcp)
TestSnapshotInspectCommand - 2019/12/30 19:12:25.940773 [INFO] agent: Started DNS server 127.0.0.1:31001 (udp)
TestSnapshotInspectCommand - 2019/12/30 19:12:25.943752 [INFO] agent: Started HTTP server on 127.0.0.1:31002 (tcp)
TestSnapshotInspectCommand - 2019/12/30 19:12:25.943913 [INFO] agent: started state syncer
2019/12/30 19:12:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:25 [INFO]  raft: Node at 127.0.0.1:31006 [Candidate] entering Candidate state in term 2
2019/12/30 19:12:26 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:26 [INFO]  raft: Node at 127.0.0.1:31006 [Leader] entering Leader state
TestSnapshotInspectCommand - 2019/12/30 19:12:26.390929 [INFO] consul: cluster leadership acquired
TestSnapshotInspectCommand - 2019/12/30 19:12:26.391449 [INFO] consul: New leader elected: Node 585c3774-a1d5-5bf0-5ec3-05274bae4635
TestSnapshotInspectCommand - 2019/12/30 19:12:26.682196 [INFO] agent: Synced node info
TestSnapshotInspectCommand - 2019/12/30 19:12:26.682348 [DEBUG] agent: Node info in sync
TestSnapshotInspectCommand - 2019/12/30 19:12:26.694252 [DEBUG] agent: Node info in sync
TestSnapshotInspectCommand - 2019/12/30 19:12:27.473767 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotInspectCommand - 2019/12/30 19:12:27.474544 [DEBUG] consul: Skipping self join check for "Node 585c3774-a1d5-5bf0-5ec3-05274bae4635" since the cluster is too small
TestSnapshotInspectCommand - 2019/12/30 19:12:27.474835 [INFO] consul: member 'Node 585c3774-a1d5-5bf0-5ec3-05274bae4635' joined, marking health alive
2019/12/30 19:12:27 [INFO] consul.fsm: snapshot created in 192.672µs
2019/12/30 19:12:27 [INFO]  raft: Starting snapshot up to 10
2019/12/30 19:12:27 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotInspectCommand-agent593446060/raft/snapshots/2-10-1577733147644.tmp
2019/12/30 19:12:27 [INFO]  raft: Snapshot to 10 complete
TestSnapshotInspectCommand - 2019/12/30 19:12:27.975564 [DEBUG] http: Request GET /v1/snapshot (1.364276422s) from=127.0.0.1:34432
TestSnapshotInspectCommand - 2019/12/30 19:12:27.986830 [INFO] agent: Requesting shutdown
TestSnapshotInspectCommand - 2019/12/30 19:12:27.987074 [INFO] consul: shutting down server
TestSnapshotInspectCommand - 2019/12/30 19:12:27.988023 [WARN] serf: Shutdown without a Leave
TestSnapshotInspectCommand - 2019/12/30 19:12:28.047714 [WARN] serf: Shutdown without a Leave
TestSnapshotInspectCommand - 2019/12/30 19:12:28.097814 [INFO] manager: shutting down
TestSnapshotInspectCommand - 2019/12/30 19:12:28.098266 [INFO] agent: consul server down
TestSnapshotInspectCommand - 2019/12/30 19:12:28.098319 [INFO] agent: shutdown complete
TestSnapshotInspectCommand - 2019/12/30 19:12:28.098370 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (tcp)
TestSnapshotInspectCommand - 2019/12/30 19:12:28.098505 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (udp)
TestSnapshotInspectCommand - 2019/12/30 19:12:28.098647 [INFO] agent: Stopping HTTP server 127.0.0.1:31002 (tcp)
TestSnapshotInspectCommand - 2019/12/30 19:12:28.099097 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotInspectCommand - 2019/12/30 19:12:28.099254 [INFO] agent: Endpoints down
--- PASS: TestSnapshotInspectCommand (3.00s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/inspect	3.316s
=== RUN   TestSnapshotRestoreCommand_noTabs
=== PAUSE TestSnapshotRestoreCommand_noTabs
=== RUN   TestSnapshotRestoreCommand_Validation
=== PAUSE TestSnapshotRestoreCommand_Validation
=== RUN   TestSnapshotRestoreCommand
=== PAUSE TestSnapshotRestoreCommand
=== CONT  TestSnapshotRestoreCommand_noTabs
=== CONT  TestSnapshotRestoreCommand
--- PASS: TestSnapshotRestoreCommand_noTabs (0.02s)
=== CONT  TestSnapshotRestoreCommand_Validation
--- PASS: TestSnapshotRestoreCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotRestoreCommand - 2019/12/30 19:12:30.966308 [WARN] agent: Node name "Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotRestoreCommand - 2019/12/30 19:12:30.967664 [DEBUG] tlsutil: Update with version 1
TestSnapshotRestoreCommand - 2019/12/30 19:12:30.976987 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:12:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4acf5ce7-edc6-4560-cdfb-82f55bc3abc2 Address:127.0.0.1:40006}]
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.703235 [INFO] serf: EventMemberJoin: Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2.dc1 127.0.0.1
2019/12/30 19:12:31 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.724190 [INFO] serf: EventMemberJoin: Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2 127.0.0.1
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.726407 [INFO] consul: Adding LAN server Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2 (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.726742 [INFO] consul: Handled member-join event for server "Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2.dc1" in area "wan"
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.734577 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.734668 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.739517 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestSnapshotRestoreCommand - 2019/12/30 19:12:31.739719 [INFO] agent: started state syncer
2019/12/30 19:12:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:31 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
2019/12/30 19:12:32 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:32 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestSnapshotRestoreCommand - 2019/12/30 19:12:32.148331 [INFO] consul: cluster leadership acquired
TestSnapshotRestoreCommand - 2019/12/30 19:12:32.148945 [INFO] consul: New leader elected: Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2
TestSnapshotRestoreCommand - 2019/12/30 19:12:32.407381 [INFO] agent: Synced node info
TestSnapshotRestoreCommand - 2019/12/30 19:12:33.198666 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotRestoreCommand - 2019/12/30 19:12:33.199197 [DEBUG] consul: Skipping self join check for "Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2" since the cluster is too small
TestSnapshotRestoreCommand - 2019/12/30 19:12:33.199378 [INFO] consul: member 'Node 4acf5ce7-edc6-4560-cdfb-82f55bc3abc2' joined, marking health alive
2019/12/30 19:12:33 [INFO] consul.fsm: snapshot created in 246.34µs
2019/12/30 19:12:33 [INFO]  raft: Starting snapshot up to 9
2019/12/30 19:12:33 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotRestoreCommand-agent060591489/raft/snapshots/2-9-1577733153341.tmp
2019/12/30 19:12:33 [INFO]  raft: Snapshot to 9 complete
TestSnapshotRestoreCommand - 2019/12/30 19:12:33.628585 [DEBUG] http: Request GET /v1/snapshot (1.215875416s) from=127.0.0.1:47604
2019/12/30 19:12:33 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotRestoreCommand-agent060591489/raft/snapshots/2-11-1577733153731.tmp
2019/12/30 19:12:33 [INFO]  raft: Copied 3506 bytes to local snapshot
2019/12/30 19:12:33 [INFO]  raft: Restored user snapshot (index 11)
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.208601 [DEBUG] http: Request PUT /v1/snapshot (555.280956ms) from=127.0.0.1:47606
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.210669 [INFO] agent: Requesting shutdown
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.210779 [INFO] consul: shutting down server
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.210829 [WARN] serf: Shutdown without a Leave
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.264597 [WARN] serf: Shutdown without a Leave
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.314675 [INFO] manager: shutting down
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.315225 [INFO] agent: consul server down
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.315284 [INFO] agent: shutdown complete
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.315335 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.315462 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.315615 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.316268 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotRestoreCommand - 2019/12/30 19:12:34.316443 [INFO] agent: Endpoints down
--- PASS: TestSnapshotRestoreCommand (3.46s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/restore	3.750s
=== RUN   TestSnapshotSaveCommand_noTabs
=== PAUSE TestSnapshotSaveCommand_noTabs
=== RUN   TestSnapshotSaveCommand_Validation
=== PAUSE TestSnapshotSaveCommand_Validation
=== RUN   TestSnapshotSaveCommand
=== PAUSE TestSnapshotSaveCommand
=== CONT  TestSnapshotSaveCommand_noTabs
=== CONT  TestSnapshotSaveCommand
=== CONT  TestSnapshotSaveCommand_Validation
--- PASS: TestSnapshotSaveCommand_Validation (0.00s)
--- PASS: TestSnapshotSaveCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotSaveCommand - 2019/12/30 19:12:57.325422 [WARN] agent: Node name "Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotSaveCommand - 2019/12/30 19:12:57.326509 [DEBUG] tlsutil: Update with version 1
TestSnapshotSaveCommand - 2019/12/30 19:12:57.339531 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:12:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c0ae1bd5-718f-f0d9-abea-995fb8159dfb Address:127.0.0.1:38506}]
2019/12/30 19:12:58 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
TestSnapshotSaveCommand - 2019/12/30 19:12:58.027689 [INFO] serf: EventMemberJoin: Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb.dc1 127.0.0.1
TestSnapshotSaveCommand - 2019/12/30 19:12:58.033478 [INFO] serf: EventMemberJoin: Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb 127.0.0.1
TestSnapshotSaveCommand - 2019/12/30 19:12:58.035909 [INFO] consul: Adding LAN server Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestSnapshotSaveCommand - 2019/12/30 19:12:58.040129 [INFO] consul: Handled member-join event for server "Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb.dc1" in area "wan"
TestSnapshotSaveCommand - 2019/12/30 19:12:58.043441 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestSnapshotSaveCommand - 2019/12/30 19:12:58.043686 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestSnapshotSaveCommand - 2019/12/30 19:12:58.049742 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestSnapshotSaveCommand - 2019/12/30 19:12:58.050833 [INFO] agent: started state syncer
2019/12/30 19:12:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:12:58 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/12/30 19:12:58 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:12:58 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
TestSnapshotSaveCommand - 2019/12/30 19:12:58.477370 [INFO] consul: cluster leadership acquired
TestSnapshotSaveCommand - 2019/12/30 19:12:58.478143 [INFO] consul: New leader elected: Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb
TestSnapshotSaveCommand - 2019/12/30 19:12:59.150778 [INFO] agent: Synced node info
TestSnapshotSaveCommand - 2019/12/30 19:12:59.649189 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotSaveCommand - 2019/12/30 19:12:59.649846 [DEBUG] consul: Skipping self join check for "Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb" since the cluster is too small
TestSnapshotSaveCommand - 2019/12/30 19:12:59.650014 [INFO] consul: member 'Node c0ae1bd5-718f-f0d9-abea-995fb8159dfb' joined, marking health alive
2019/12/30 19:12:59 [INFO] consul.fsm: snapshot created in 273.008µs
2019/12/30 19:12:59 [INFO]  raft: Starting snapshot up to 9
2019/12/30 19:12:59 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotSaveCommand-agent502389149/raft/snapshots/2-9-1577733179783.tmp
2019/12/30 19:13:00 [INFO]  raft: Snapshot to 9 complete
TestSnapshotSaveCommand - 2019/12/30 19:13:00.127448 [DEBUG] http: Request GET /v1/snapshot (1.481796532s) from=127.0.0.1:47964
2019/12/30 19:13:00 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotSaveCommand-agent502389149/raft/snapshots/2-11-1577733180206.tmp
2019/12/30 19:13:00 [INFO]  raft: Copied 3506 bytes to local snapshot
2019/12/30 19:13:00 [INFO]  raft: Restored user snapshot (index 11)
TestSnapshotSaveCommand - 2019/12/30 19:13:00.767816 [DEBUG] http: Request PUT /v1/snapshot (627.523549ms) from=127.0.0.1:47966
TestSnapshotSaveCommand - 2019/12/30 19:13:00.769957 [INFO] agent: Requesting shutdown
TestSnapshotSaveCommand - 2019/12/30 19:13:00.770049 [INFO] consul: shutting down server
TestSnapshotSaveCommand - 2019/12/30 19:13:00.770096 [WARN] serf: Shutdown without a Leave
TestSnapshotSaveCommand - 2019/12/30 19:13:00.823429 [WARN] serf: Shutdown without a Leave
TestSnapshotSaveCommand - 2019/12/30 19:13:00.865161 [INFO] manager: shutting down
TestSnapshotSaveCommand - 2019/12/30 19:13:00.865602 [INFO] agent: consul server down
TestSnapshotSaveCommand - 2019/12/30 19:13:00.865650 [INFO] agent: shutdown complete
TestSnapshotSaveCommand - 2019/12/30 19:13:00.865701 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestSnapshotSaveCommand - 2019/12/30 19:13:00.865824 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestSnapshotSaveCommand - 2019/12/30 19:13:00.865966 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestSnapshotSaveCommand - 2019/12/30 19:13:00.866593 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotSaveCommand - 2019/12/30 19:13:00.866765 [INFO] agent: Endpoints down
--- PASS: TestSnapshotSaveCommand (3.65s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/save	3.938s
=== RUN   TestValidateCommand_noTabs
=== PAUSE TestValidateCommand_noTabs
=== RUN   TestValidateCommand_FailOnEmptyFile
=== PAUSE TestValidateCommand_FailOnEmptyFile
=== RUN   TestValidateCommand_SucceedOnMinimalConfigFile
=== PAUSE TestValidateCommand_SucceedOnMinimalConfigFile
=== RUN   TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== PAUSE TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== RUN   TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== PAUSE TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== RUN   TestValidateCommand_SucceedWithJSONAsHCL
=== PAUSE TestValidateCommand_SucceedWithJSONAsHCL
=== RUN   TestValidateCommand_SucceedOnMinimalConfigDir
=== PAUSE TestValidateCommand_SucceedOnMinimalConfigDir
=== RUN   TestValidateCommand_FailForInvalidJSONConfigFormat
=== PAUSE TestValidateCommand_FailForInvalidJSONConfigFormat
=== RUN   TestValidateCommand_Quiet
=== PAUSE TestValidateCommand_Quiet
=== CONT  TestValidateCommand_noTabs
=== CONT  TestValidateCommand_SucceedWithJSONAsHCL
--- PASS: TestValidateCommand_noTabs (0.00s)
=== CONT  TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== CONT  TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== CONT  TestValidateCommand_SucceedOnMinimalConfigFile
--- PASS: TestValidateCommand_SucceedWithJSONAsHCL (0.11s)
=== CONT  TestValidateCommand_FailOnEmptyFile
--- PASS: TestValidateCommand_SucceedWithMinimalHCLConfigFormat (0.12s)
=== CONT  TestValidateCommand_Quiet
--- PASS: TestValidateCommand_SucceedOnMinimalConfigFile (0.12s)
=== CONT  TestValidateCommand_FailForInvalidJSONConfigFormat
--- PASS: TestValidateCommand_SucceedWithMinimalJSONConfigFormat (0.14s)
=== CONT  TestValidateCommand_SucceedOnMinimalConfigDir
--- PASS: TestValidateCommand_FailForInvalidJSONConfigFormat (0.03s)
--- PASS: TestValidateCommand_Quiet (0.08s)
--- PASS: TestValidateCommand_FailOnEmptyFile (0.11s)
--- PASS: TestValidateCommand_SucceedOnMinimalConfigDir (0.08s)
PASS
ok  	github.com/hashicorp/consul/command/validate	0.396s
=== RUN   TestVersionCommand_noTabs
=== PAUSE TestVersionCommand_noTabs
=== CONT  TestVersionCommand_noTabs
--- PASS: TestVersionCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/version	0.264s
=== RUN   TestWatchCommand_noTabs
=== PAUSE TestWatchCommand_noTabs
=== RUN   TestWatchCommand
=== PAUSE TestWatchCommand
=== RUN   TestWatchCommand_loadToken
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommand_loadToken - 2019/12/30 19:13:24.318916 [WARN] agent: Node name "Node 0b11594a-c687-82c5-aa9c-cda73274bd14" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommand_loadToken - 2019/12/30 19:13:24.320058 [DEBUG] tlsutil: Update with version 1
TestWatchCommand_loadToken - 2019/12/30 19:13:24.327420 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:13:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0b11594a-c687-82c5-aa9c-cda73274bd14 Address:127.0.0.1:19006}]
2019/12/30 19:13:25 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
TestWatchCommand_loadToken - 2019/12/30 19:13:25.091070 [INFO] serf: EventMemberJoin: Node 0b11594a-c687-82c5-aa9c-cda73274bd14.dc1 127.0.0.1
TestWatchCommand_loadToken - 2019/12/30 19:13:25.097915 [INFO] serf: EventMemberJoin: Node 0b11594a-c687-82c5-aa9c-cda73274bd14 127.0.0.1
TestWatchCommand_loadToken - 2019/12/30 19:13:25.105069 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestWatchCommand_loadToken - 2019/12/30 19:13:25.110574 [INFO] consul: Adding LAN server Node 0b11594a-c687-82c5-aa9c-cda73274bd14 (Addr: tcp/127.0.0.1:19006) (DC: dc1)
TestWatchCommand_loadToken - 2019/12/30 19:13:25.112337 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestWatchCommand_loadToken - 2019/12/30 19:13:25.112386 [INFO] consul: Handled member-join event for server "Node 0b11594a-c687-82c5-aa9c-cda73274bd14.dc1" in area "wan"
TestWatchCommand_loadToken - 2019/12/30 19:13:25.119842 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestWatchCommand_loadToken - 2019/12/30 19:13:25.120101 [INFO] agent: started state syncer
2019/12/30 19:13:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:25 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:25 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:25 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestWatchCommand_loadToken - 2019/12/30 19:13:25.991587 [INFO] consul: cluster leadership acquired
TestWatchCommand_loadToken - 2019/12/30 19:13:25.992264 [INFO] consul: New leader elected: Node 0b11594a-c687-82c5-aa9c-cda73274bd14
TestWatchCommand_loadToken - 2019/12/30 19:13:26.393919 [INFO] agent: Synced node info
TestWatchCommand_loadToken - 2019/12/30 19:13:27.958569 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommand_loadToken - 2019/12/30 19:13:27.959255 [DEBUG] consul: Skipping self join check for "Node 0b11594a-c687-82c5-aa9c-cda73274bd14" since the cluster is too small
TestWatchCommand_loadToken - 2019/12/30 19:13:27.959483 [INFO] consul: member 'Node 0b11594a-c687-82c5-aa9c-cda73274bd14' joined, marking health alive
=== RUN   TestWatchCommand_loadToken/token_arg
=== RUN   TestWatchCommand_loadToken/token_env
=== RUN   TestWatchCommand_loadToken/token_file_arg
=== RUN   TestWatchCommand_loadToken/token_file_env
TestWatchCommand_loadToken - 2019/12/30 19:13:28.256722 [INFO] agent: Requesting shutdown
TestWatchCommand_loadToken - 2019/12/30 19:13:28.256916 [INFO] consul: shutting down server
TestWatchCommand_loadToken - 2019/12/30 19:13:28.257215 [WARN] serf: Shutdown without a Leave
TestWatchCommand_loadToken - 2019/12/30 19:13:28.262700 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestWatchCommand_loadToken - 2019/12/30 19:13:28.262790 [DEBUG] agent: Node info in sync
TestWatchCommand_loadToken - 2019/12/30 19:13:28.262878 [DEBUG] agent: Node info in sync
TestWatchCommand_loadToken - 2019/12/30 19:13:28.424239 [WARN] serf: Shutdown without a Leave
TestWatchCommand_loadToken - 2019/12/30 19:13:28.465739 [INFO] manager: shutting down
TestWatchCommand_loadToken - 2019/12/30 19:13:28.466139 [INFO] agent: consul server down
TestWatchCommand_loadToken - 2019/12/30 19:13:28.466200 [INFO] agent: shutdown complete
TestWatchCommand_loadToken - 2019/12/30 19:13:28.466278 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestWatchCommand_loadToken - 2019/12/30 19:13:28.466428 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestWatchCommand_loadToken - 2019/12/30 19:13:28.466590 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestWatchCommand_loadToken - 2019/12/30 19:13:28.466826 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommand_loadToken - 2019/12/30 19:13:28.466904 [INFO] agent: Endpoints down
--- PASS: TestWatchCommand_loadToken (4.23s)
    --- PASS: TestWatchCommand_loadToken/token_arg (0.00s)
    --- PASS: TestWatchCommand_loadToken/token_env (0.00s)
    --- PASS: TestWatchCommand_loadToken/token_file_arg (0.00s)
    --- PASS: TestWatchCommand_loadToken/token_file_env (0.00s)
=== RUN   TestWatchCommandNoConnect
=== PAUSE TestWatchCommandNoConnect
=== RUN   TestWatchCommandNoAgentService
=== PAUSE TestWatchCommandNoAgentService
=== CONT  TestWatchCommand_noTabs
=== CONT  TestWatchCommandNoAgentService
=== CONT  TestWatchCommandNoConnect
=== CONT  TestWatchCommand
--- PASS: TestWatchCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommandNoAgentService - 2019/12/30 19:13:28.599057 [WARN] agent: Node name "Node c5881570-239e-7918-fb0d-0dfcaec4532c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommandNoAgentService - 2019/12/30 19:13:28.609593 [DEBUG] tlsutil: Update with version 1
TestWatchCommandNoAgentService - 2019/12/30 19:13:28.611815 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommandNoConnect - 2019/12/30 19:13:28.645606 [WARN] agent: Node name "Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommandNoConnect - 2019/12/30 19:13:28.646293 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommand - 2019/12/30 19:13:28.648507 [WARN] agent: Node name "Node 8e2856a9-70fd-7067-d82f-21b1511de91b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommand - 2019/12/30 19:13:28.649005 [DEBUG] tlsutil: Update with version 1
TestWatchCommand - 2019/12/30 19:13:28.651583 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestWatchCommandNoConnect - 2019/12/30 19:13:28.651583 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:13:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5 Address:127.0.0.1:19024}]
2019/12/30 19:13:30 [INFO]  raft: Node at 127.0.0.1:19024 [Follower] entering Follower state (Leader: "")
2019/12/30 19:13:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c5881570-239e-7918-fb0d-0dfcaec4532c Address:127.0.0.1:19012}]
2019/12/30 19:13:30 [INFO]  raft: Node at 127.0.0.1:19012 [Follower] entering Follower state (Leader: "")
TestWatchCommandNoConnect - 2019/12/30 19:13:30.448588 [INFO] serf: EventMemberJoin: Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5.dc1 127.0.0.1
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.451637 [INFO] serf: EventMemberJoin: Node c5881570-239e-7918-fb0d-0dfcaec4532c.dc1 127.0.0.1
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.461426 [INFO] serf: EventMemberJoin: Node c5881570-239e-7918-fb0d-0dfcaec4532c 127.0.0.1
TestWatchCommandNoConnect - 2019/12/30 19:13:30.463236 [INFO] serf: EventMemberJoin: Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5 127.0.0.1
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.464321 [INFO] consul: Handled member-join event for server "Node c5881570-239e-7918-fb0d-0dfcaec4532c.dc1" in area "wan"
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.464769 [INFO] consul: Adding LAN server Node c5881570-239e-7918-fb0d-0dfcaec4532c (Addr: tcp/127.0.0.1:19012) (DC: dc1)
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.465070 [INFO] agent: Started DNS server 127.0.0.1:19007 (udp)
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.465421 [INFO] agent: Started DNS server 127.0.0.1:19007 (tcp)
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.467923 [INFO] agent: Started HTTP server on 127.0.0.1:19008 (tcp)
TestWatchCommandNoAgentService - 2019/12/30 19:13:30.468075 [INFO] agent: started state syncer
TestWatchCommandNoConnect - 2019/12/30 19:13:30.469342 [INFO] consul: Adding LAN server Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5 (Addr: tcp/127.0.0.1:19024) (DC: dc1)
TestWatchCommandNoConnect - 2019/12/30 19:13:30.469926 [INFO] consul: Handled member-join event for server "Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5.dc1" in area "wan"
TestWatchCommandNoConnect - 2019/12/30 19:13:30.471559 [INFO] agent: Started DNS server 127.0.0.1:19019 (tcp)
TestWatchCommandNoConnect - 2019/12/30 19:13:30.474806 [INFO] agent: Started DNS server 127.0.0.1:19019 (udp)
TestWatchCommandNoConnect - 2019/12/30 19:13:30.477325 [INFO] agent: Started HTTP server on 127.0.0.1:19020 (tcp)
TestWatchCommandNoConnect - 2019/12/30 19:13:30.477457 [INFO] agent: started state syncer
2019/12/30 19:13:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:30 [INFO]  raft: Node at 127.0.0.1:19024 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:30 [INFO]  raft: Node at 127.0.0.1:19012 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8e2856a9-70fd-7067-d82f-21b1511de91b Address:127.0.0.1:19018}]
TestWatchCommand - 2019/12/30 19:13:30.545331 [INFO] serf: EventMemberJoin: Node 8e2856a9-70fd-7067-d82f-21b1511de91b.dc1 127.0.0.1
2019/12/30 19:13:30 [INFO]  raft: Node at 127.0.0.1:19018 [Follower] entering Follower state (Leader: "")
TestWatchCommand - 2019/12/30 19:13:30.577453 [INFO] serf: EventMemberJoin: Node 8e2856a9-70fd-7067-d82f-21b1511de91b 127.0.0.1
TestWatchCommand - 2019/12/30 19:13:30.578722 [INFO] agent: Started DNS server 127.0.0.1:19013 (udp)
TestWatchCommand - 2019/12/30 19:13:30.581020 [INFO] consul: Adding LAN server Node 8e2856a9-70fd-7067-d82f-21b1511de91b (Addr: tcp/127.0.0.1:19018) (DC: dc1)
TestWatchCommand - 2019/12/30 19:13:30.583431 [INFO] consul: Handled member-join event for server "Node 8e2856a9-70fd-7067-d82f-21b1511de91b.dc1" in area "wan"
TestWatchCommand - 2019/12/30 19:13:30.589128 [INFO] agent: Started DNS server 127.0.0.1:19013 (tcp)
TestWatchCommand - 2019/12/30 19:13:30.593416 [INFO] agent: Started HTTP server on 127.0.0.1:19014 (tcp)
TestWatchCommand - 2019/12/30 19:13:30.593517 [INFO] agent: started state syncer
2019/12/30 19:13:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:30 [INFO]  raft: Node at 127.0.0.1:19018 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:31 [INFO]  raft: Node at 127.0.0.1:19024 [Leader] entering Leader state
2019/12/30 19:13:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:31 [INFO]  raft: Node at 127.0.0.1:19018 [Leader] entering Leader state
TestWatchCommand - 2019/12/30 19:13:31.150351 [INFO] consul: cluster leadership acquired
TestWatchCommandNoConnect - 2019/12/30 19:13:31.150653 [INFO] consul: cluster leadership acquired
TestWatchCommandNoConnect - 2019/12/30 19:13:31.151145 [INFO] consul: New leader elected: Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5
2019/12/30 19:13:31 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:31 [INFO]  raft: Node at 127.0.0.1:19012 [Leader] entering Leader state
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.152071 [INFO] consul: cluster leadership acquired
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.152491 [INFO] consul: New leader elected: Node c5881570-239e-7918-fb0d-0dfcaec4532c
TestWatchCommand - 2019/12/30 19:13:31.153038 [INFO] consul: New leader elected: Node 8e2856a9-70fd-7067-d82f-21b1511de91b
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.462465 [INFO] agent: Requesting shutdown
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.462584 [INFO] consul: shutting down server
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.462631 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.484222 [INFO] agent: Synced node info
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.491613 [DEBUG] agent: Node info in sync
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.491732 [DEBUG] agent: Node info in sync
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.582398 [WARN] serf: Shutdown without a Leave
TestWatchCommand - 2019/12/30 19:13:31.641976 [INFO] agent: Synced node info
TestWatchCommand - 2019/12/30 19:13:31.642100 [DEBUG] agent: Node info in sync
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.642430 [INFO] manager: shutting down
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.732449 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.732706 [INFO] agent: consul server down
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.732756 [INFO] agent: shutdown complete
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.732804 [ERR] consul: failed to establish leadership: raft is already shutdown
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.732815 [INFO] agent: Stopping DNS server 127.0.0.1:19007 (tcp)
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.733018 [INFO] agent: Stopping DNS server 127.0.0.1:19007 (udp)
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.733166 [INFO] agent: Stopping HTTP server 127.0.0.1:19008 (tcp)
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.733363 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommandNoAgentService - 2019/12/30 19:13:31.733401 [INFO] agent: Endpoints down
--- PASS: TestWatchCommandNoAgentService (3.26s)
TestWatchCommandNoConnect - 2019/12/30 19:13:31.850017 [INFO] agent: Synced node info
TestWatchCommandNoConnect - 2019/12/30 19:13:31.850140 [DEBUG] agent: Node info in sync
TestWatchCommand - 2019/12/30 19:13:32.042173 [DEBUG] agent: Node info in sync
TestWatchCommand - 2019/12/30 19:13:32.908274 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommand - 2019/12/30 19:13:32.909006 [DEBUG] consul: Skipping self join check for "Node 8e2856a9-70fd-7067-d82f-21b1511de91b" since the cluster is too small
TestWatchCommand - 2019/12/30 19:13:32.909181 [INFO] consul: member 'Node 8e2856a9-70fd-7067-d82f-21b1511de91b' joined, marking health alive
TestWatchCommandNoConnect - 2019/12/30 19:13:33.125018 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommandNoConnect - 2019/12/30 19:13:33.125716 [DEBUG] consul: Skipping self join check for "Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5" since the cluster is too small
TestWatchCommandNoConnect - 2019/12/30 19:13:33.125898 [INFO] consul: member 'Node 12f32e36-6ac6-81c2-53f0-9a3f66b1ebc5' joined, marking health alive
TestWatchCommand - 2019/12/30 19:13:33.225874 [DEBUG] http: Request GET /v1/agent/self (8.430893ms) from=127.0.0.1:33154
TestWatchCommand - 2019/12/30 19:13:33.246502 [DEBUG] http: Request GET /v1/catalog/nodes (5.970493ms) from=127.0.0.1:33156
TestWatchCommand - 2019/12/30 19:13:33.250507 [INFO] agent: Requesting shutdown
TestWatchCommand - 2019/12/30 19:13:33.250606 [INFO] consul: shutting down server
TestWatchCommand - 2019/12/30 19:13:33.250658 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoConnect - 2019/12/30 19:13:33.332771 [INFO] agent: Requesting shutdown
TestWatchCommandNoConnect - 2019/12/30 19:13:33.332905 [INFO] consul: shutting down server
TestWatchCommandNoConnect - 2019/12/30 19:13:33.332966 [WARN] serf: Shutdown without a Leave
TestWatchCommand - 2019/12/30 19:13:33.349232 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoConnect - 2019/12/30 19:13:33.399359 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoConnect - 2019/12/30 19:13:33.458115 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestWatchCommandNoConnect - 2019/12/30 19:13:33.458205 [DEBUG] agent: Node info in sync
TestWatchCommand - 2019/12/30 19:13:33.701661 [INFO] manager: shutting down
TestWatchCommandNoConnect - 2019/12/30 19:13:33.701670 [INFO] manager: shutting down
TestWatchCommandNoConnect - 2019/12/30 19:13:33.702146 [INFO] agent: consul server down
TestWatchCommandNoConnect - 2019/12/30 19:13:33.702201 [INFO] agent: shutdown complete
TestWatchCommand - 2019/12/30 19:13:33.702237 [INFO] agent: consul server down
TestWatchCommandNoConnect - 2019/12/30 19:13:33.702256 [INFO] agent: Stopping DNS server 127.0.0.1:19019 (tcp)
TestWatchCommand - 2019/12/30 19:13:33.702274 [INFO] agent: shutdown complete
TestWatchCommand - 2019/12/30 19:13:33.702319 [INFO] agent: Stopping DNS server 127.0.0.1:19013 (tcp)
TestWatchCommandNoConnect - 2019/12/30 19:13:33.702375 [INFO] agent: Stopping DNS server 127.0.0.1:19019 (udp)
TestWatchCommand - 2019/12/30 19:13:33.702434 [INFO] agent: Stopping DNS server 127.0.0.1:19013 (udp)
TestWatchCommandNoConnect - 2019/12/30 19:13:33.702528 [INFO] agent: Stopping HTTP server 127.0.0.1:19020 (tcp)
TestWatchCommand - 2019/12/30 19:13:33.702559 [INFO] agent: Stopping HTTP server 127.0.0.1:19014 (tcp)
TestWatchCommandNoConnect - 2019/12/30 19:13:33.702729 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommandNoConnect - 2019/12/30 19:13:33.702808 [INFO] agent: Endpoints down
--- PASS: TestWatchCommandNoConnect (5.23s)
TestWatchCommand - 2019/12/30 19:13:33.703256 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommand - 2019/12/30 19:13:33.703357 [INFO] agent: Endpoints down
--- PASS: TestWatchCommand (5.23s)
PASS
ok  	github.com/hashicorp/consul/command/watch	9.805s
=== RUN   TestStaticResolver_Resolve
=== RUN   TestStaticResolver_Resolve/simples
--- PASS: TestStaticResolver_Resolve (0.00s)
    --- PASS: TestStaticResolver_Resolve/simples (0.00s)
=== RUN   TestConsulResolver_Resolve
WARNING: bootstrap = true: do not enable unless necessary
test-consul - 2019/12/30 19:13:37.154613 [WARN] agent: Node name "Node 6524020a-0f73-cb86-2fce-ad006f4e8207" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test-consul - 2019/12/30 19:13:37.155672 [DEBUG] tlsutil: Update with version 1
test-consul - 2019/12/30 19:13:37.165896 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:13:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6524020a-0f73-cb86-2fce-ad006f4e8207 Address:127.0.0.1:35506}]
2019/12/30 19:13:38 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
test-consul - 2019/12/30 19:13:38.744214 [INFO] serf: EventMemberJoin: Node 6524020a-0f73-cb86-2fce-ad006f4e8207.dc1 127.0.0.1
test-consul - 2019/12/30 19:13:38.751134 [INFO] serf: EventMemberJoin: Node 6524020a-0f73-cb86-2fce-ad006f4e8207 127.0.0.1
test-consul - 2019/12/30 19:13:38.756315 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
test-consul - 2019/12/30 19:13:38.771573 [INFO] consul: Adding LAN server Node 6524020a-0f73-cb86-2fce-ad006f4e8207 (Addr: tcp/127.0.0.1:35506) (DC: dc1)
test-consul - 2019/12/30 19:13:38.772249 [INFO] consul: Handled member-join event for server "Node 6524020a-0f73-cb86-2fce-ad006f4e8207.dc1" in area "wan"
test-consul - 2019/12/30 19:13:38.772593 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
test-consul - 2019/12/30 19:13:38.775228 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
test-consul - 2019/12/30 19:13:38.780365 [INFO] agent: started state syncer
2019/12/30 19:13:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:38 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:39 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:39 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
test-consul - 2019/12/30 19:13:39.216794 [INFO] consul: cluster leadership acquired
test-consul - 2019/12/30 19:13:39.217439 [INFO] consul: New leader elected: Node 6524020a-0f73-cb86-2fce-ad006f4e8207
test-consul - 2019/12/30 19:13:39.517154 [INFO] agent: Synced node info
test-consul - 2019/12/30 19:13:40.317145 [INFO] agent: Synced service "web"
test-consul - 2019/12/30 19:13:40.317267 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:40.317372 [DEBUG] http: Request PUT /v1/agent/service/register (975.032855ms) from=127.0.0.1:58740
test-consul - 2019/12/30 19:13:40.453080 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/30 19:13:40.836206 [ERR] leaf watch error: invalid type for leaf response: <nil>
test-consul - 2019/12/30 19:13:40.886260 [INFO] agent: Synced service "web-proxy"
test-consul - 2019/12/30 19:13:40.886337 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:40.886407 [DEBUG] http: Request PUT /v1/agent/service/register (567.254235ms) from=127.0.0.1:58740
test-consul - 2019/12/30 19:13:41.051241 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/30 19:13:41.051313 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/30 19:13:41.301018 [INFO] agent: Synced service "web-proxy-2"
test-consul - 2019/12/30 19:13:41.301088 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:41.301148 [DEBUG] http: Request PUT /v1/agent/service/register (413.07776ms) from=127.0.0.1:58740
test-consul - 2019/12/30 19:13:41.350068 [ERR] leaf watch error: invalid type for leaf response: <nil>
test-consul - 2019/12/30 19:13:41.451228 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/30 19:13:41.451298 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/30 19:13:41.451336 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/12/30 19:13:41.642373 [INFO] connect: initialized primary datacenter CA with provider "consul"
test-consul - 2019/12/30 19:13:41.642435 [INFO] agent: Synced service "db"
test-consul - 2019/12/30 19:13:41.642477 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:41.642537 [DEBUG] http: Request PUT /v1/agent/service/register (339.924462ms) from=127.0.0.1:58740
test-consul - 2019/12/30 19:13:41.642787 [DEBUG] consul: Skipping self join check for "Node 6524020a-0f73-cb86-2fce-ad006f4e8207" since the cluster is too small
test-consul - 2019/12/30 19:13:41.642973 [INFO] consul: member 'Node 6524020a-0f73-cb86-2fce-ad006f4e8207' joined, marking health alive
test-consul - 2019/12/30 19:13:41.916312 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
test-consul - 2019/12/30 19:13:41.924030 [DEBUG] http: Request POST /v1/query (278.858489ms) from=127.0.0.1:58740
=== RUN   TestConsulResolver_Resolve/basic_service_discovery
test-consul - 2019/12/30 19:13:41.936031 [DEBUG] http: Request GET /v1/health/connect/web?connect=true&passing=1&stale= (6.372171ms) from=127.0.0.1:58740
=== RUN   TestConsulResolver_Resolve/basic_service_with_native_service
test-consul - 2019/12/30 19:13:41.944087 [DEBUG] http: Request GET /v1/health/connect/db?connect=true&passing=1&stale= (1.375036ms) from=127.0.0.1:58740
=== RUN   TestConsulResolver_Resolve/Bad_Type_errors
=== RUN   TestConsulResolver_Resolve/Non-existent_service_errors
test-consul - 2019/12/30 19:13:41.950636 [DEBUG] http: Request GET /v1/health/connect/foo?connect=true&passing=1&stale= (1.343036ms) from=127.0.0.1:58740
=== RUN   TestConsulResolver_Resolve/timeout_errors
=== RUN   TestConsulResolver_Resolve/prepared_query_by_id
test-consul - 2019/12/30 19:13:41.955246 [DEBUG] http: Request GET /v1/query/9d0148eb-f5b3-ed2b-9263-f08bad28befb/execute?connect=true&stale= (1.955386ms) from=127.0.0.1:58740
=== RUN   TestConsulResolver_Resolve/prepared_query_by_name
test-consul - 2019/12/30 19:13:41.961925 [DEBUG] http: Request GET /v1/query/test-query/execute?connect=true&stale= (2.202392ms) from=127.0.0.1:58740
test-consul - 2019/12/30 19:13:41.965363 [INFO] agent: Requesting shutdown
test-consul - 2019/12/30 19:13:41.965472 [INFO] consul: shutting down server
test-consul - 2019/12/30 19:13:41.965517 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/30 19:13:42.044013 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
test-consul - 2019/12/30 19:13:42.044093 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/30 19:13:42.044135 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/30 19:13:42.044168 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/12/30 19:13:42.044201 [DEBUG] agent: Service "db" in sync
test-consul - 2019/12/30 19:13:42.044240 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:42.044308 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/30 19:13:42.044347 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/30 19:13:42.044415 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/12/30 19:13:42.044457 [DEBUG] agent: Service "db" in sync
test-consul - 2019/12/30 19:13:42.044488 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:42.099589 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/30 19:13:42.216038 [INFO] manager: shutting down
test-consul - 2019/12/30 19:13:42.216697 [INFO] agent: consul server down
test-consul - 2019/12/30 19:13:42.216754 [INFO] agent: shutdown complete
test-consul - 2019/12/30 19:13:42.216808 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
test-consul - 2019/12/30 19:13:42.216939 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
test-consul - 2019/12/30 19:13:42.217093 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
test-consul - 2019/12/30 19:13:42.217507 [INFO] agent: Waiting for endpoints to shut down
test-consul - 2019/12/30 19:13:42.217600 [INFO] agent: Endpoints down
--- PASS: TestConsulResolver_Resolve (5.14s)
    --- PASS: TestConsulResolver_Resolve/basic_service_discovery (0.01s)
    --- PASS: TestConsulResolver_Resolve/basic_service_with_native_service (0.01s)
    --- PASS: TestConsulResolver_Resolve/Bad_Type_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/Non-existent_service_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/timeout_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/prepared_query_by_id (0.01s)
    --- PASS: TestConsulResolver_Resolve/prepared_query_by_name (0.01s)
=== RUN   TestConsulResolverFromAddrFunc
=== RUN   TestConsulResolverFromAddrFunc/service
=== RUN   TestConsulResolverFromAddrFunc/query
=== RUN   TestConsulResolverFromAddrFunc/service_with_dc
=== RUN   TestConsulResolverFromAddrFunc/query_with_dc
=== RUN   TestConsulResolverFromAddrFunc/invalid_host:port
=== RUN   TestConsulResolverFromAddrFunc/custom_domain
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter#01
=== RUN   TestConsulResolverFromAddrFunc/unsupported_tag_filter
=== RUN   TestConsulResolverFromAddrFunc/unsupported_tag_filter_with_DC
--- PASS: TestConsulResolverFromAddrFunc (0.01s)
    --- PASS: TestConsulResolverFromAddrFunc/service (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/query (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/service_with_dc (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/query_with_dc (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/invalid_host:port (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/custom_domain (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter#01 (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_tag_filter (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_tag_filter_with_DC (0.00s)
=== RUN   TestService_Name
--- PASS: TestService_Name (0.06s)
=== RUN   TestService_Dial
--- SKIP: TestService_Dial (0.00s)
    service_test.go:36: DM-skipped
=== RUN   TestService_ServerTLSConfig
--- SKIP: TestService_ServerTLSConfig (0.00s)
    service_test.go:129: DM-skipped
=== RUN   TestService_HTTPClient
2019/12/30 19:13:42 starting test connect HTTPS server on 127.0.0.1:35507
2019/12/30 19:13:42 test connect service listening on 127.0.0.1:35507
2019/12/30 19:13:42 [DEBUG] resolved service instance: 127.0.0.1:35507 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/backend)
2019/12/30 19:13:42 [DEBUG] successfully connected to 127.0.0.1:35507 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/backend)
--- PASS: TestService_HTTPClient (0.22s)
=== RUN   TestService_HasDefaultHTTPResolverFromAddr
--- PASS: TestService_HasDefaultHTTPResolverFromAddr (0.00s)
=== RUN   Test_verifyServerCertMatchesURI
2019/12/30 19:13:42 [ERR] consul.watch: Watch (type: connect_leaf) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/leaf/foo: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 5s
2019/12/30 19:13:42 [ERR] consul.watch: Watch (type: connect_roots) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/roots: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 5s
=== RUN   Test_verifyServerCertMatchesURI/simple_match
=== RUN   Test_verifyServerCertMatchesURI/different_trust-domain_allowed
=== RUN   Test_verifyServerCertMatchesURI/mismatch
=== RUN   Test_verifyServerCertMatchesURI/no_certs
=== RUN   Test_verifyServerCertMatchesURI/nil_certs
--- PASS: Test_verifyServerCertMatchesURI (0.10s)
    --- PASS: Test_verifyServerCertMatchesURI/simple_match (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/different_trust-domain_allowed (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/mismatch (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/no_certs (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/nil_certs (0.00s)
=== RUN   TestClientSideVerifier
=== RUN   TestClientSideVerifier/ok_service_ca1
=== RUN   TestClientSideVerifier/untrusted_CA
=== RUN   TestClientSideVerifier/cross_signed_intermediate
=== RUN   TestClientSideVerifier/cross_signed_without_intermediate
--- PASS: TestClientSideVerifier (0.20s)
    --- PASS: TestClientSideVerifier/ok_service_ca1 (0.01s)
    --- PASS: TestClientSideVerifier/untrusted_CA (0.00s)
    --- PASS: TestClientSideVerifier/cross_signed_intermediate (0.03s)
    --- PASS: TestClientSideVerifier/cross_signed_without_intermediate (0.00s)
=== RUN   TestServerSideVerifier
WARNING: bootstrap = true: do not enable unless necessary
test-consul - 2019/12/30 19:13:42.970155 [WARN] agent: Node name "Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test-consul - 2019/12/30 19:13:42.970912 [DEBUG] tlsutil: Update with version 1
test-consul - 2019/12/30 19:13:42.973549 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:13:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e983d41c-5d5e-fb6b-69f8-fe9108d4f46e Address:127.0.0.1:35513}]
2019/12/30 19:13:43 [INFO]  raft: Node at 127.0.0.1:35513 [Follower] entering Follower state (Leader: "")
test-consul - 2019/12/30 19:13:43.778903 [INFO] serf: EventMemberJoin: Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e.dc1 127.0.0.1
test-consul - 2019/12/30 19:13:43.786262 [INFO] serf: EventMemberJoin: Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e 127.0.0.1
test-consul - 2019/12/30 19:13:43.787354 [INFO] consul: Adding LAN server Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e (Addr: tcp/127.0.0.1:35513) (DC: dc1)
test-consul - 2019/12/30 19:13:43.787583 [INFO] consul: Handled member-join event for server "Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e.dc1" in area "wan"
test-consul - 2019/12/30 19:13:43.788147 [INFO] agent: Started DNS server 127.0.0.1:35508 (tcp)
test-consul - 2019/12/30 19:13:43.788228 [INFO] agent: Started DNS server 127.0.0.1:35508 (udp)
test-consul - 2019/12/30 19:13:43.791130 [INFO] agent: Started HTTP server on 127.0.0.1:35509 (tcp)
test-consul - 2019/12/30 19:13:43.791236 [INFO] agent: started state syncer
2019/12/30 19:13:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:43 [INFO]  raft: Node at 127.0.0.1:35513 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:44 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:44 [INFO]  raft: Node at 127.0.0.1:35513 [Leader] entering Leader state
test-consul - 2019/12/30 19:13:44.258647 [INFO] consul: cluster leadership acquired
test-consul - 2019/12/30 19:13:44.259287 [INFO] consul: New leader elected: Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e
test-consul - 2019/12/30 19:13:44.558572 [INFO] agent: Synced node info
test-consul - 2019/12/30 19:13:44.558701 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:44.633155 [DEBUG] agent: Node info in sync
test-consul - 2019/12/30 19:13:46.575263 [INFO] connect: initialized primary datacenter CA with provider "consul"
test-consul - 2019/12/30 19:13:46.575747 [DEBUG] consul: Skipping self join check for "Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e" since the cluster is too small
test-consul - 2019/12/30 19:13:46.575899 [INFO] consul: member 'Node e983d41c-5d5e-fb6b-69f8-fe9108d4f46e' joined, marking health alive
test-consul - 2019/12/30 19:13:46.708178 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
test-consul - 2019/12/30 19:13:46.984645 [DEBUG] http: Request POST /v1/connect/intentions (257.008234ms) from=127.0.0.1:49558
test-consul - 2019/12/30 19:13:47.293604 [DEBUG] http: Request POST /v1/connect/intentions (304.713515ms) from=127.0.0.1:49558
=== RUN   TestServerSideVerifier/ok_service_ca1,_allow
test-consul - 2019/12/30 19:13:47.463582 [DEBUG] http: Request POST /v1/agent/connect/authorize (2.884744ms) from=127.0.0.1:49558
=== RUN   TestServerSideVerifier/untrusted_CA
2019/12/30 19:13:47 connect: failed TLS verification: x509: certificate signed by unknown authority
=== RUN   TestServerSideVerifier/cross_signed_intermediate,_allow
2019/12/30 19:13:47 [ERR] consul.watch: Watch (type: connect_leaf) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/leaf/foo: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 20s
2019/12/30 19:13:47 [ERR] consul.watch: Watch (type: connect_roots) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/roots: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 20s
test-consul - 2019/12/30 19:13:47.520255 [DEBUG] http: Request POST /v1/agent/connect/authorize (2.194726ms) from=127.0.0.1:49558
=== RUN   TestServerSideVerifier/cross_signed_without_intermediate
2019/12/30 19:13:47 connect: failed TLS verification: x509: certificate signed by unknown authority
=== RUN   TestServerSideVerifier/ok_service_ca1,_deny
test-consul - 2019/12/30 19:13:47.551797 [DEBUG] http: Request POST /v1/agent/connect/authorize (1.346703ms) from=127.0.0.1:49558
2019/12/30 19:13:47 connect: authz call denied: Matched intention: DENY default/* => default/db (ID: 7623f1c8-ac32-ac34-f5d2-1bb1ab126698, Precedence: 8)
=== RUN   TestServerSideVerifier/cross_signed_intermediate,_deny
test-consul - 2019/12/30 19:13:47.639915 [DEBUG] http: Request POST /v1/agent/connect/authorize (2.411398ms) from=127.0.0.1:49558
2019/12/30 19:13:47 connect: authz call denied: Matched intention: DENY default/* => default/db (ID: 7623f1c8-ac32-ac34-f5d2-1bb1ab126698, Precedence: 8)
test-consul - 2019/12/30 19:13:47.642176 [INFO] agent: Requesting shutdown
test-consul - 2019/12/30 19:13:47.642239 [INFO] consul: shutting down server
test-consul - 2019/12/30 19:13:47.642283 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/30 19:13:47.782786 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/30 19:13:47.851015 [INFO] manager: shutting down
test-consul - 2019/12/30 19:13:47.854486 [INFO] agent: consul server down
test-consul - 2019/12/30 19:13:47.854853 [INFO] agent: shutdown complete
test-consul - 2019/12/30 19:13:47.855051 [INFO] agent: Stopping DNS server 127.0.0.1:35508 (tcp)
test-consul - 2019/12/30 19:13:47.855401 [INFO] agent: Stopping DNS server 127.0.0.1:35508 (udp)
test-consul - 2019/12/30 19:13:47.855767 [INFO] agent: Stopping HTTP server 127.0.0.1:35509 (tcp)
test-consul - 2019/12/30 19:13:47.858824 [INFO] agent: Waiting for endpoints to shut down
test-consul - 2019/12/30 19:13:47.858923 [INFO] agent: Endpoints down
--- PASS: TestServerSideVerifier (5.04s)
    --- PASS: TestServerSideVerifier/ok_service_ca1,_allow (0.03s)
    --- PASS: TestServerSideVerifier/untrusted_CA (0.00s)
    --- PASS: TestServerSideVerifier/cross_signed_intermediate,_allow (0.05s)
    --- PASS: TestServerSideVerifier/cross_signed_without_intermediate (0.00s)
    --- PASS: TestServerSideVerifier/ok_service_ca1,_deny (0.03s)
    --- PASS: TestServerSideVerifier/cross_signed_intermediate,_deny (0.09s)
=== RUN   TestDynamicTLSConfig
--- PASS: TestDynamicTLSConfig (0.11s)
=== RUN   TestDynamicTLSConfig_Ready
--- PASS: TestDynamicTLSConfig_Ready (0.10s)
PASS
ok  	github.com/hashicorp/consul/connect	11.289s
?   	github.com/hashicorp/consul/connect/certgen	[no test files]
=== RUN   TestUpstreamResolverFuncFromClient
=== PAUSE TestUpstreamResolverFuncFromClient
=== RUN   TestAgentConfigWatcherManagedProxy
=== PAUSE TestAgentConfigWatcherManagedProxy
=== RUN   TestAgentConfigWatcherSidecarProxy
=== PAUSE TestAgentConfigWatcherSidecarProxy
=== RUN   TestConn
--- SKIP: TestConn (0.00s)
    conn_test.go:67: DM-skipped
=== RUN   TestConnSrcClosing
=== PAUSE TestConnSrcClosing
=== RUN   TestConnDstClosing
=== PAUSE TestConnDstClosing
=== RUN   TestPublicListener
2019/12/30 19:13:46 test tcp server listening on localhost:43002
2019/12/30 19:13:46 [DEBUG] resolved service instance: localhost:43001 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db)
2019/12/30 19:13:46 [DEBUG] successfully connected to localhost:43001 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db)
2019/12/30 19:13:46 connect: nil client
2019/12/30 19:13:46 test tcp echo server 127.0.0.1:43002 stopped
--- PASS: TestPublicListener (0.75s)
=== RUN   TestUpstreamListener
--- SKIP: TestUpstreamListener (0.00s)
    listener_test.go:162: DM-skipped
=== RUN   TestProxy_public
--- SKIP: TestProxy_public (0.00s)
    proxy_test.go:22: DM-skipped
=== CONT  TestUpstreamResolverFuncFromClient
=== RUN   TestUpstreamResolverFuncFromClient/service
=== CONT  TestAgentConfigWatcherSidecarProxy
=== CONT  TestConnDstClosing
=== RUN   TestUpstreamResolverFuncFromClient/prepared_query
=== RUN   TestUpstreamResolverFuncFromClient/unknown_behaves_like_service
--- PASS: TestUpstreamResolverFuncFromClient (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/service (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/prepared_query (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/unknown_behaves_like_service (0.00s)
=== CONT  TestAgentConfigWatcherManagedProxy
--- PASS: TestConnDstClosing (0.01s)
=== CONT  TestConnSrcClosing
--- PASS: TestConnSrcClosing (0.03s)
WARNING: bootstrap = true: do not enable unless necessary
agent_smith - 2019/12/30 19:13:46.394133 [WARN] agent: Node name "Node 4a2b344f-cf41-a697-57d1-9f94152b36cc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
agent_smith - 2019/12/30 19:13:46.403578 [WARN] agent: Node name "Node 860da9d0-66ab-1ca7-2657-f4e07b957409" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
agent_smith - 2019/12/30 19:13:46.412931 [DEBUG] tlsutil: Update with version 1
agent_smith - 2019/12/30 19:13:46.426279 [DEBUG] tlsutil: Update with version 1
agent_smith - 2019/12/30 19:13:46.431822 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
agent_smith - 2019/12/30 19:13:46.442857 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/30 19:13:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4a2b344f-cf41-a697-57d1-9f94152b36cc Address:127.0.0.1:43014}]
2019/12/30 19:13:47 [INFO]  raft: Node at 127.0.0.1:43014 [Follower] entering Follower state (Leader: "")
agent_smith - 2019/12/30 19:13:47.480896 [INFO] serf: EventMemberJoin: Node 4a2b344f-cf41-a697-57d1-9f94152b36cc.dc1 127.0.0.1
agent_smith - 2019/12/30 19:13:47.485659 [INFO] serf: EventMemberJoin: Node 4a2b344f-cf41-a697-57d1-9f94152b36cc 127.0.0.1
agent_smith - 2019/12/30 19:13:47.487015 [INFO] consul: Adding LAN server Node 4a2b344f-cf41-a697-57d1-9f94152b36cc (Addr: tcp/127.0.0.1:43014) (DC: dc1)
agent_smith - 2019/12/30 19:13:47.487511 [INFO] consul: Handled member-join event for server "Node 4a2b344f-cf41-a697-57d1-9f94152b36cc.dc1" in area "wan"
agent_smith - 2019/12/30 19:13:47.488171 [INFO] agent: Started DNS server 127.0.0.1:43009 (udp)
agent_smith - 2019/12/30 19:13:47.488245 [INFO] agent: Started DNS server 127.0.0.1:43009 (tcp)
agent_smith - 2019/12/30 19:13:47.490817 [INFO] agent: Started HTTP server on 127.0.0.1:43010 (tcp)
agent_smith - 2019/12/30 19:13:47.490987 [INFO] agent: started state syncer
2019/12/30 19:13:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:47 [INFO]  raft: Node at 127.0.0.1:43014 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:860da9d0-66ab-1ca7-2657-f4e07b957409 Address:127.0.0.1:43008}]
2019/12/30 19:13:47 [INFO]  raft: Node at 127.0.0.1:43008 [Follower] entering Follower state (Leader: "")
agent_smith - 2019/12/30 19:13:47.570446 [INFO] serf: EventMemberJoin: Node 860da9d0-66ab-1ca7-2657-f4e07b957409.dc1 127.0.0.1
agent_smith - 2019/12/30 19:13:47.573983 [INFO] serf: EventMemberJoin: Node 860da9d0-66ab-1ca7-2657-f4e07b957409 127.0.0.1
agent_smith - 2019/12/30 19:13:47.574764 [INFO] consul: Adding LAN server Node 860da9d0-66ab-1ca7-2657-f4e07b957409 (Addr: tcp/127.0.0.1:43008) (DC: dc1)
agent_smith - 2019/12/30 19:13:47.575165 [INFO] consul: Handled member-join event for server "Node 860da9d0-66ab-1ca7-2657-f4e07b957409.dc1" in area "wan"
agent_smith - 2019/12/30 19:13:47.581108 [INFO] agent: Started DNS server 127.0.0.1:43003 (tcp)
agent_smith - 2019/12/30 19:13:47.581199 [INFO] agent: Started DNS server 127.0.0.1:43003 (udp)
agent_smith - 2019/12/30 19:13:47.583576 [INFO] agent: Started HTTP server on 127.0.0.1:43004 (tcp)
agent_smith - 2019/12/30 19:13:47.583708 [INFO] agent: started state syncer
2019/12/30 19:13:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/30 19:13:47 [INFO]  raft: Node at 127.0.0.1:43008 [Candidate] entering Candidate state in term 2
2019/12/30 19:13:48 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:48 [INFO]  raft: Node at 127.0.0.1:43014 [Leader] entering Leader state
agent_smith - 2019/12/30 19:13:48.109169 [INFO] consul: cluster leadership acquired
agent_smith - 2019/12/30 19:13:48.109967 [INFO] consul: New leader elected: Node 4a2b344f-cf41-a697-57d1-9f94152b36cc
2019/12/30 19:13:48 [INFO]  raft: Election won. Tally: 1
2019/12/30 19:13:48 [INFO]  raft: Node at 127.0.0.1:43008 [Leader] entering Leader state
agent_smith - 2019/12/30 19:13:48.192716 [INFO] consul: cluster leadership acquired
agent_smith - 2019/12/30 19:13:48.193427 [INFO] consul: New leader elected: Node 860da9d0-66ab-1ca7-2657-f4e07b957409
agent_smith - 2019/12/30 19:13:48.651533 [INFO] agent: Synced service "web"
agent_smith - 2019/12/30 19:13:48.651601 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/30 19:13:48.664745 [ERR] leaf watch error: invalid type for leaf response: <nil>
agent_smith - 2019/12/30 19:13:48.811396 [INFO] agent: Synced node info
agent_smith - 2019/12/30 19:13:48.970998 [ERR] leaf watch error: invalid type for leaf response: <nil>
agent_smith - 2019/12/30 19:13:49.054363 [DEBUG] agent: Service "web" in sync
agent_smith - 2019/12/30 19:13:49.658947 [INFO] agent: Synced service "web-proxy"
agent_smith - 2019/12/30 19:13:49.659134 [DEBUG] agent: Check "service:web-proxy" in sync
agent_smith - 2019/12/30 19:13:49.659202 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/30 19:13:49.659326 [DEBUG] http: Request PUT /v1/agent/service/register (1.499439924s) from=127.0.0.1:57584
agent_smith - 2019/12/30 19:13:49.669492 [DEBUG] http: Request GET /v1/agent/service/web-proxy (2.544068ms) from=127.0.0.1:57584
agent_smith - 2019/12/30 19:13:50.036668 [DEBUG] http: Request GET /v1/agent/service/web-proxy?hash=756b7483c4e58024 (361.641376ms) from=127.0.0.1:57584
agent_smith - 2019/12/30 19:13:50.040990 [INFO] agent: Requesting shutdown
agent_smith - 2019/12/30 19:13:50.127286 [INFO] agent: Synced service "web-sidecar-proxy"
agent_smith - 2019/12/30 19:13:50.311360 [INFO] connect: initialized primary datacenter CA with provider "consul"
agent_smith - 2019/12/30 19:13:50.312119 [DEBUG] consul: Skipping self join check for "Node 4a2b344f-cf41-a697-57d1-9f94152b36cc" since the cluster is too small
agent_smith - 2019/12/30 19:13:50.312361 [INFO] consul: member 'Node 4a2b344f-cf41-a697-57d1-9f94152b36cc' joined, marking health alive
agent_smith - 2019/12/30 19:13:50.314253 [INFO] agent: Synced service "web"
agent_smith - 2019/12/30 19:13:50.314355 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/12/30 19:13:50.314493 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/12/30 19:13:50.314534 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/30 19:13:50.314599 [DEBUG] http: Request PUT /v1/agent/service/register (1.988409719s) from=127.0.0.1:35966
agent_smith - 2019/12/30 19:13:50.314367 [INFO] consul: shutting down server
agent_smith - 2019/12/30 19:13:50.314686 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/30 19:13:50.318858 [DEBUG] http: Request GET /v1/agent/service/web-sidecar-proxy (1.757048ms) from=127.0.0.1:35966
agent_smith - 2019/12/30 19:13:50.442439 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/30 19:13:50.526385 [INFO] manager: shutting down
agent_smith - 2019/12/30 19:13:50.526611 [WARN] agent: Syncing service "web-proxy" failed. raft is already shutdown
agent_smith - 2019/12/30 19:13:50.526673 [ERR] agent: failed to sync changes: raft is already shutdown
agent_smith - 2019/12/30 19:13:50.526740 [DEBUG] http: Request PUT /v1/agent/service/register (829.089259ms) from=127.0.0.1:57588
agent_smith - 2019/12/30 19:13:50.526944 [INFO] agent: consul server down
agent_smith - 2019/12/30 19:13:50.526993 [INFO] agent: shutdown complete
agent_smith - 2019/12/30 19:13:50.527047 [INFO] agent: Stopping DNS server 127.0.0.1:43009 (tcp)
agent_smith - 2019/12/30 19:13:50.527184 [INFO] agent: Stopping DNS server 127.0.0.1:43009 (udp)
agent_smith - 2019/12/30 19:13:50.526625 [ERR] consul: failed to reconcile member: {Node 4a2b344f-cf41-a697-57d1-9f94152b36cc 127.0.0.1 43012 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:4a2b344f-cf41-a697-57d1-9f94152b36cc port:43014 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:43013] alive 1 5 2 2 5 4}: leadership lost while committing log
agent_smith - 2019/12/30 19:13:50.527328 [INFO] agent: Stopping HTTP server 127.0.0.1:43010 (tcp)
agent_smith - 2019/12/30 19:13:50.535175 [ERR] leaf watch error: invalid type for leaf response: <nil>
agent_smith - 2019/12/30 19:13:50.734023 [INFO] connect: initialized primary datacenter CA with provider "consul"
agent_smith - 2019/12/30 19:13:50.734541 [DEBUG] consul: Skipping self join check for "Node 860da9d0-66ab-1ca7-2657-f4e07b957409" since the cluster is too small
agent_smith - 2019/12/30 19:13:50.734714 [INFO] consul: member 'Node 860da9d0-66ab-1ca7-2657-f4e07b957409' joined, marking health alive
agent_smith - 2019/12/30 19:13:51.266629 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
agent_smith - 2019/12/30 19:13:51.269292 [INFO] agent: Synced service "web"
agent_smith - 2019/12/30 19:13:51.411051 [INFO] agent: Synced service "web-sidecar-proxy"
agent_smith - 2019/12/30 19:13:51.411144 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/12/30 19:13:51.411191 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/12/30 19:13:51.411223 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/30 19:13:51.411314 [DEBUG] http: Request PUT /v1/agent/service/register (1.062662529s) from=127.0.0.1:35970
agent_smith - 2019/12/30 19:13:51.411948 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
agent_smith - 2019/12/30 19:13:51.412596 [DEBUG] http: Request GET /v1/agent/service/web-sidecar-proxy?hash=4a87c9bd1a9bd791 (1.086507502s) from=127.0.0.1:35966
agent_smith - 2019/12/30 19:13:51.417364 [INFO] agent: Requesting shutdown
agent_smith - 2019/12/30 19:13:51.527775 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:43010 (tcp)
agent_smith - 2019/12/30 19:13:51.527964 [INFO] agent: Waiting for endpoints to shut down
agent_smith - 2019/12/30 19:13:51.528079 [INFO] agent: Endpoints down
--- PASS: TestAgentConfigWatcherManagedProxy (5.32s)
agent_smith - 2019/12/30 19:13:51.567411 [INFO] agent: Synced service "web-sidecar-proxy"
agent_smith - 2019/12/30 19:13:51.567477 [DEBUG] agent: Service "web" in sync
agent_smith - 2019/12/30 19:13:51.567532 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/12/30 19:13:51.567590 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/12/30 19:13:51.567622 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/30 19:13:51.567807 [DEBUG] agent: Service "web" in sync
agent_smith - 2019/12/30 19:13:51.567961 [DEBUG] agent: Service "web-sidecar-proxy" in sync
agent_smith - 2019/12/30 19:13:51.568073 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/12/30 19:13:51.568182 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/12/30 19:13:51.568279 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/30 19:13:51.568682 [INFO] consul: shutting down server
agent_smith - 2019/12/30 19:13:51.568812 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/30 19:13:51.616077 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/30 19:13:51.666205 [INFO] manager: shutting down
agent_smith - 2019/12/30 19:13:51.666742 [INFO] agent: consul server down
agent_smith - 2019/12/30 19:13:51.666798 [INFO] agent: shutdown complete
agent_smith - 2019/12/30 19:13:51.666855 [INFO] agent: Stopping DNS server 127.0.0.1:43003 (tcp)
agent_smith - 2019/12/30 19:13:51.666985 [INFO] agent: Stopping DNS server 127.0.0.1:43003 (udp)
agent_smith - 2019/12/30 19:13:51.667132 [INFO] agent: Stopping HTTP server 127.0.0.1:43004 (tcp)
agent_smith - 2019/12/30 19:13:52.667438 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:43004 (tcp)
agent_smith - 2019/12/30 19:13:52.667508 [INFO] agent: Waiting for endpoints to shut down
agent_smith - 2019/12/30 19:13:52.667545 [INFO] agent: Endpoints down
--- PASS: TestAgentConfigWatcherSidecarProxy (6.46s)
PASS
ok  	github.com/hashicorp/consul/connect/proxy	7.481s
=== RUN   TestIsPrivateIP
=== RUN   TestIsPrivateIP/10.0.0.1
=== RUN   TestIsPrivateIP/100.64.0.1
=== RUN   TestIsPrivateIP/172.16.0.1
=== RUN   TestIsPrivateIP/192.168.0.1
=== RUN   TestIsPrivateIP/192.0.0.1
=== RUN   TestIsPrivateIP/192.0.2.1
=== RUN   TestIsPrivateIP/127.0.0.1
=== RUN   TestIsPrivateIP/169.254.0.1
=== RUN   TestIsPrivateIP/1.2.3.4
=== RUN   TestIsPrivateIP/::1
=== RUN   TestIsPrivateIP/fe80::1
=== RUN   TestIsPrivateIP/fc00::1
=== RUN   TestIsPrivateIP/fec0::1
=== RUN   TestIsPrivateIP/2001:db8::1
=== RUN   TestIsPrivateIP/2004:db6::1
--- PASS: TestIsPrivateIP (0.01s)
    --- PASS: TestIsPrivateIP/10.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/100.64.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/172.16.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.168.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.0.2.1 (0.00s)
    --- PASS: TestIsPrivateIP/127.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/169.254.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/1.2.3.4 (0.00s)
    --- PASS: TestIsPrivateIP/::1 (0.00s)
    --- PASS: TestIsPrivateIP/fe80::1 (0.00s)
    --- PASS: TestIsPrivateIP/fc00::1 (0.00s)
    --- PASS: TestIsPrivateIP/fec0::1 (0.00s)
    --- PASS: TestIsPrivateIP/2001:db8::1 (0.00s)
    --- PASS: TestIsPrivateIP/2004:db6::1 (0.00s)
=== RUN   TestIsIPv6
=== RUN   TestIsIPv6/10.0.0.1
=== RUN   TestIsIPv6/100.64.0.1
=== RUN   TestIsIPv6/172.16.0.1
=== RUN   TestIsIPv6/192.168.0.1
=== RUN   TestIsIPv6/192.0.0.1
=== RUN   TestIsIPv6/192.0.2.1
=== RUN   TestIsIPv6/127.0.0.1
=== RUN   TestIsIPv6/169.254.0.1
=== RUN   TestIsIPv6/1.2.3.4
=== RUN   TestIsIPv6/::1
=== RUN   TestIsIPv6/fe80::1
=== RUN   TestIsIPv6/fc00::1
=== RUN   TestIsIPv6/fec0::1
=== RUN   TestIsIPv6/2001:db8::1
=== RUN   TestIsIPv6/2004:db6::1
=== RUN   TestIsIPv6/example.com
=== RUN   TestIsIPv6/localhost
=== RUN   TestIsIPv6/1.257.0.1
--- PASS: TestIsIPv6 (0.00s)
    --- PASS: TestIsIPv6/10.0.0.1 (0.00s)
    --- PASS: TestIsIPv6/100.64.0.1 (0.00s)
    --- PASS: TestIsIPv6/172.16.0.1 (0.00s)
    --- PASS: TestIsIPv6/192.168.0.1 (0.00s)
    --- PASS: TestIsIPv6/192.0.0.1 (0.00s)
    --- PASS: TestIsIPv6/192.0.2.1 (0.00s)
    --- PASS: TestIsIPv6/127.0.0.1 (0.00s)
    --- PASS: TestIsIPv6/169.254.0.1 (0.00s)
    --- PASS: TestIsIPv6/1.2.3.4 (0.00s)
    --- PASS: TestIsIPv6/::1 (0.00s)
    --- PASS: TestIsIPv6/fe80::1 (0.00s)
    --- PASS: TestIsIPv6/fc00::1 (0.00s)
    --- PASS: TestIsIPv6/fec0::1 (0.00s)
    --- PASS: TestIsIPv6/2001:db8::1 (0.00s)
    --- PASS: TestIsIPv6/2004:db6::1 (0.00s)
    --- PASS: TestIsIPv6/example.com (0.00s)
    --- PASS: TestIsIPv6/localhost (0.00s)
    --- PASS: TestIsIPv6/1.257.0.1 (0.00s)
PASS
ok  	github.com/hashicorp/consul/ipaddr	0.058s
=== RUN   TestDurationMinusBuffer
--- PASS: TestDurationMinusBuffer (0.00s)
=== RUN   TestDurationMinusBufferDomain
--- PASS: TestDurationMinusBufferDomain (0.00s)
=== RUN   TestRandomStagger
--- PASS: TestRandomStagger (0.00s)
=== RUN   TestRateScaledInterval
--- PASS: TestRateScaledInterval (0.00s)
=== RUN   TestMapWalk
--- SKIP: TestMapWalk (0.00s)
    map_walker_test.go:10: DM-skipped
=== RUN   TestJitterRandomStagger
=== PAUSE TestJitterRandomStagger
=== RUN   TestRetryWaiter_calculateWait
=== PAUSE TestRetryWaiter_calculateWait
=== RUN   TestRetryWaiter_WaitChans
=== PAUSE TestRetryWaiter_WaitChans
=== RUN   TestRTT_ComputeDistance
=== RUN   TestRTT_ComputeDistance/10_ms
=== RUN   TestRTT_ComputeDistance/0_ms
=== RUN   TestRTT_ComputeDistance/2_ms
=== RUN   TestRTT_ComputeDistance/2_ms_reversed
=== RUN   TestRTT_ComputeDistance/a_nil
=== RUN   TestRTT_ComputeDistance/b_nil
=== RUN   TestRTT_ComputeDistance/both_nil
--- PASS: TestRTT_ComputeDistance (0.00s)
    --- PASS: TestRTT_ComputeDistance/10_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/0_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/2_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/2_ms_reversed (0.00s)
    --- PASS: TestRTT_ComputeDistance/a_nil (0.00s)
    --- PASS: TestRTT_ComputeDistance/b_nil (0.00s)
    --- PASS: TestRTT_ComputeDistance/both_nil (0.00s)
=== RUN   TestRTT_Intersect
=== RUN   TestRTT_Intersect/nil_maps
=== RUN   TestRTT_Intersect/two_servers
=== RUN   TestRTT_Intersect/two_clients
=== RUN   TestRTT_Intersect/server1_and_client_alpha
=== RUN   TestRTT_Intersect/server1_and_client_beta_1
=== RUN   TestRTT_Intersect/server1_and_client_alpha_reversed
=== RUN   TestRTT_Intersect/server1_and_client_beta_1_reversed
=== RUN   TestRTT_Intersect/nothing_in_common
=== RUN   TestRTT_Intersect/nothing_in_common_reversed
--- PASS: TestRTT_Intersect (0.00s)
    --- PASS: TestRTT_Intersect/nil_maps (0.00s)
    --- PASS: TestRTT_Intersect/two_servers (0.00s)
    --- PASS: TestRTT_Intersect/two_clients (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_alpha (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_beta_1 (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_alpha_reversed (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_beta_1_reversed (0.00s)
    --- PASS: TestRTT_Intersect/nothing_in_common (0.00s)
    --- PASS: TestRTT_Intersect/nothing_in_common_reversed (0.00s)
=== RUN   TestStrContains
--- PASS: TestStrContains (0.00s)
=== RUN   TestTelemetryConfig_MergeDefaults
=== RUN   TestTelemetryConfig_MergeDefaults/basic_merge
=== RUN   TestTelemetryConfig_MergeDefaults/exhaustive
--- PASS: TestTelemetryConfig_MergeDefaults (0.00s)
    --- PASS: TestTelemetryConfig_MergeDefaults/basic_merge (0.00s)
    --- PASS: TestTelemetryConfig_MergeDefaults/exhaustive (0.00s)
=== RUN   TestTranslateKeys
=== RUN   TestTranslateKeys/x->y
=== RUN   TestTranslateKeys/discard_x
=== RUN   TestTranslateKeys/b.x->b.y
=== RUN   TestTranslateKeys/json:_x->y
=== RUN   TestTranslateKeys/json:_X->y
=== RUN   TestTranslateKeys/json:_discard_x
=== RUN   TestTranslateKeys/json:_b.x->b.y
=== RUN   TestTranslateKeys/json:_b[0].x->b[0].y
--- PASS: TestTranslateKeys (0.00s)
    --- PASS: TestTranslateKeys/x->y (0.00s)
    --- PASS: TestTranslateKeys/discard_x (0.00s)
    --- PASS: TestTranslateKeys/b.x->b.y (0.00s)
    --- PASS: TestTranslateKeys/json:_x->y (0.00s)
    --- PASS: TestTranslateKeys/json:_X->y (0.00s)
    --- PASS: TestTranslateKeys/json:_discard_x (0.00s)
    --- PASS: TestTranslateKeys/json:_b.x->b.y (0.00s)
    --- PASS: TestTranslateKeys/json:_b[0].x->b[0].y (0.00s)
=== RUN   TestUserAgent
--- PASS: TestUserAgent (0.00s)
=== RUN   TestMathAbsInt
--- PASS: TestMathAbsInt (0.00s)
=== RUN   TestMathMaxInt
--- PASS: TestMathMaxInt (0.00s)
=== RUN   TestMathMinInt
--- PASS: TestMathMinInt (0.00s)
=== CONT  TestJitterRandomStagger
=== RUN   TestJitterRandomStagger/0_percent
=== PAUSE TestJitterRandomStagger/0_percent
=== RUN   TestJitterRandomStagger/10_percent
=== PAUSE TestJitterRandomStagger/10_percent
=== RUN   TestJitterRandomStagger/100_percent
=== PAUSE TestJitterRandomStagger/100_percent
=== CONT  TestJitterRandomStagger/0_percent
=== CONT  TestRetryWaiter_calculateWait
=== RUN   TestRetryWaiter_calculateWait/Defaults
=== PAUSE TestRetryWaiter_calculateWait/Defaults
=== RUN   TestRetryWaiter_calculateWait/Minimum_Wait
=== PAUSE TestRetryWaiter_calculateWait/Minimum_Wait
=== RUN   TestRetryWaiter_calculateWait/Minimum_Failures
=== PAUSE TestRetryWaiter_calculateWait/Minimum_Failures
=== RUN   TestRetryWaiter_calculateWait/Maximum_Wait
=== PAUSE TestRetryWaiter_calculateWait/Maximum_Wait
=== CONT  TestRetryWaiter_calculateWait/Defaults
=== CONT  TestRetryWaiter_WaitChans
=== RUN   TestRetryWaiter_WaitChans/Minimum_Wait_-_Success
=== PAUSE TestRetryWaiter_WaitChans/Minimum_Wait_-_Success
=== RUN   TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf
=== PAUSE TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf
=== RUN   TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr
=== PAUSE TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr
=== RUN   TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed
=== PAUSE TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed
=== RUN   TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf
=== PAUSE TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf
=== RUN   TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr
=== PAUSE TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr
=== CONT  TestRetryWaiter_WaitChans/Minimum_Wait_-_Success
=== CONT  TestRetryWaiter_calculateWait/Minimum_Wait
=== CONT  TestJitterRandomStagger/10_percent
=== CONT  TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed
=== CONT  TestJitterRandomStagger/100_percent
--- PASS: TestJitterRandomStagger (0.00s)
    --- PASS: TestJitterRandomStagger/0_percent (0.00s)
    --- PASS: TestJitterRandomStagger/10_percent (0.00s)
    --- PASS: TestJitterRandomStagger/100_percent (0.00s)
=== CONT  TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr
=== CONT  TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf
=== CONT  TestRetryWaiter_calculateWait/Maximum_Wait
=== CONT  TestRetryWaiter_calculateWait/Minimum_Failures
=== CONT  TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr
--- PASS: TestRetryWaiter_calculateWait (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Defaults (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Minimum_Wait (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Maximum_Wait (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Minimum_Failures (0.00s)
=== CONT  TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf
--- PASS: TestRetryWaiter_WaitChans (0.00s)
    --- PASS: TestRetryWaiter_WaitChans/Minimum_Wait_-_Success (0.20s)
    --- PASS: TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed (0.25s)
    --- PASS: TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr (0.25s)
    --- PASS: TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf (0.25s)
    --- PASS: TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr (0.20s)
    --- PASS: TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf (0.20s)
PASS
ok  	github.com/hashicorp/consul/lib	0.542s
=== RUN   TestWriteAtomic
--- PASS: TestWriteAtomic (0.25s)
PASS
ok  	github.com/hashicorp/consul/lib/file	1.031s
=== RUN   TestDynamic
=== PAUSE TestDynamic
=== RUN   TestDynamicPanic
=== PAUSE TestDynamicPanic
=== RUN   TestDynamicAcquire
=== PAUSE TestDynamicAcquire
=== CONT  TestDynamic
=== CONT  TestDynamicAcquire
=== CONT  TestDynamicPanic
--- PASS: TestDynamicPanic (0.00s)
--- PASS: TestDynamicAcquire (0.05s)
--- PASS: TestDynamic (1.83s)
PASS
ok  	github.com/hashicorp/consul/lib/semaphore	1.885s
=== RUN   TestGatedWriter_impl
--- PASS: TestGatedWriter_impl (0.00s)
=== RUN   TestGatedWriter
--- PASS: TestGatedWriter (0.00s)
=== RUN   TestGRPCLogger
--- PASS: TestGRPCLogger (0.00s)
=== RUN   TestGRPCLogger_V
=== RUN   TestGRPCLogger_V/ERR,-1
=== RUN   TestGRPCLogger_V/ERR,0
=== RUN   TestGRPCLogger_V/ERR,1
=== RUN   TestGRPCLogger_V/ERR,2
=== RUN   TestGRPCLogger_V/ERR,3
=== RUN   TestGRPCLogger_V/WARN,-1
=== RUN   TestGRPCLogger_V/WARN,0
=== RUN   TestGRPCLogger_V/WARN,1
=== RUN   TestGRPCLogger_V/WARN,2
=== RUN   TestGRPCLogger_V/WARN,3
=== RUN   TestGRPCLogger_V/INFO,-1
=== RUN   TestGRPCLogger_V/INFO,0
=== RUN   TestGRPCLogger_V/INFO,1
=== RUN   TestGRPCLogger_V/INFO,2
=== RUN   TestGRPCLogger_V/INFO,3
=== RUN   TestGRPCLogger_V/DEBUG,-1
=== RUN   TestGRPCLogger_V/DEBUG,0
=== RUN   TestGRPCLogger_V/DEBUG,1
=== RUN   TestGRPCLogger_V/DEBUG,2
=== RUN   TestGRPCLogger_V/DEBUG,3
=== RUN   TestGRPCLogger_V/TRACE,-1
=== RUN   TestGRPCLogger_V/TRACE,0
=== RUN   TestGRPCLogger_V/TRACE,1
=== RUN   TestGRPCLogger_V/TRACE,2
=== RUN   TestGRPCLogger_V/TRACE,3
--- PASS: TestGRPCLogger_V (0.01s)
    --- PASS: TestGRPCLogger_V/ERR,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,0 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,1 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,2 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,3 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,0 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,1 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,2 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,3 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,0 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,1 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,2 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,3 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,0 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,1 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,2 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,3 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,0 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,1 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,2 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,3 (0.00s)
=== RUN   TestLogWriter
--- PASS: TestLogWriter (0.00s)
=== RUN   TestLogFile_timeRotation
=== PAUSE TestLogFile_timeRotation
=== RUN   TestLogFile_openNew
=== PAUSE TestLogFile_openNew
=== RUN   TestLogFile_byteRotation
=== PAUSE TestLogFile_byteRotation
=== RUN   TestLogFile_logLevelFiltering
=== PAUSE TestLogFile_logLevelFiltering
=== CONT  TestLogFile_timeRotation
=== CONT  TestLogFile_logLevelFiltering
=== CONT  TestLogFile_byteRotation
--- PASS: TestLogFile_logLevelFiltering (0.00s)
=== CONT  TestLogFile_openNew
--- PASS: TestLogFile_byteRotation (0.00s)
--- PASS: TestLogFile_openNew (0.00s)
--- PASS: TestLogFile_timeRotation (2.00s)
PASS
ok  	github.com/hashicorp/consul/logger	2.053s
?   	github.com/hashicorp/consul/sdk/freeport	[no test files]
?   	github.com/hashicorp/consul/sdk/testutil	[no test files]
=== RUN   TestRetryer
=== RUN   TestRetryer/counter
=== RUN   TestRetryer/timer
--- PASS: TestRetryer (0.40s)
    --- PASS: TestRetryer/counter (0.20s)
    --- PASS: TestRetryer/timer (0.20s)
PASS
ok  	github.com/hashicorp/consul/sdk/testutil/retry	0.422s
?   	github.com/hashicorp/consul/sentinel	[no test files]
?   	github.com/hashicorp/consul/service_os	[no test files]
=== RUN   TestArchive
--- PASS: TestArchive (0.00s)
=== RUN   TestArchive_GoodData
--- PASS: TestArchive_GoodData (0.10s)
=== RUN   TestArchive_BadData
--- PASS: TestArchive_BadData (0.01s)
=== RUN   TestArchive_hashList
--- PASS: TestArchive_hashList (0.00s)
=== RUN   TestSnapshot
2019-12-30T19:13:51.083Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-9dcc7524-c203-3ae4-2b5d-07c40b5fe30e Address:9dcc7524-c203-3ae4-2b5d-07c40b5fe30e}]
2019-12-30T19:13:51.084Z [INFO]  raft: Node at 9dcc7524-c203-3ae4-2b5d-07c40b5fe30e [Follower] entering Follower state (Leader: "")
2019-12-30T19:13:53.019Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-30T19:13:53.020Z [INFO]  raft: Node at 9dcc7524-c203-3ae4-2b5d-07c40b5fe30e [Candidate] entering Candidate state in term 2
2019-12-30T19:13:53.020Z [DEBUG] raft: Votes needed: 1
2019-12-30T19:13:53.020Z [DEBUG] raft: Vote granted from server-9dcc7524-c203-3ae4-2b5d-07c40b5fe30e in term 2. Tally: 1
2019-12-30T19:13:53.020Z [INFO]  raft: Election won. Tally: 1
2019-12-30T19:13:53.020Z [INFO]  raft: Node at 9dcc7524-c203-3ae4-2b5d-07c40b5fe30e [Leader] entering Leader state
2019-12-30T19:14:04.645Z [INFO]  raft: Starting snapshot up to 65538
2019/12/30 19:14:04 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot-snapshot462227593/before/snapshots/2-65538-1577733244645.tmp
2019-12-30T19:14:05.917Z [INFO]  raft: Compacting logs from 1 to 55298
2019-12-30T19:14:05.949Z [INFO]  raft: Snapshot to 65538 complete
2019-12-30T19:14:18.452Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-0fd9b00a-ae43-747a-c1e0-fd085aedb425 Address:0fd9b00a-ae43-747a-c1e0-fd085aedb425}]
2019-12-30T19:14:18.453Z [INFO]  raft: Node at 0fd9b00a-ae43-747a-c1e0-fd085aedb425 [Follower] entering Follower state (Leader: "")
2019-12-30T19:14:20.314Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-30T19:14:20.314Z [INFO]  raft: Node at 0fd9b00a-ae43-747a-c1e0-fd085aedb425 [Candidate] entering Candidate state in term 2
2019-12-30T19:14:20.314Z [DEBUG] raft: Votes needed: 1
2019-12-30T19:14:20.314Z [DEBUG] raft: Vote granted from server-0fd9b00a-ae43-747a-c1e0-fd085aedb425 in term 2. Tally: 1
2019-12-30T19:14:20.314Z [INFO]  raft: Election won. Tally: 1
2019-12-30T19:14:20.315Z [INFO]  raft: Node at 0fd9b00a-ae43-747a-c1e0-fd085aedb425 [Leader] entering Leader state
2019/12/30 19:14:22 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot-snapshot462227593/after/snapshots/2-65539-1577733262700.tmp
2019-12-30T19:14:23.651Z [INFO]  raft: Copied 16973829 bytes to local snapshot
2019-12-30T19:14:24.360Z [INFO]  raft: Restored user snapshot (index 65539)
--- PASS: TestSnapshot (33.43s)
=== RUN   TestSnapshot_Nil
--- PASS: TestSnapshot_Nil (0.00s)
=== RUN   TestSnapshot_BadVerify
--- PASS: TestSnapshot_BadVerify (0.00s)
=== RUN   TestSnapshot_BadRestore
2019-12-30T19:14:24.514Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-dfb96550-ed93-ef4d-e8b7-038caf38b87e Address:dfb96550-ed93-ef4d-e8b7-038caf38b87e}]
2019-12-30T19:14:24.514Z [INFO]  raft: Node at dfb96550-ed93-ef4d-e8b7-038caf38b87e [Follower] entering Follower state (Leader: "")
2019-12-30T19:14:26.385Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-30T19:14:26.385Z [INFO]  raft: Node at dfb96550-ed93-ef4d-e8b7-038caf38b87e [Candidate] entering Candidate state in term 2
2019-12-30T19:14:26.385Z [DEBUG] raft: Votes needed: 1
2019-12-30T19:14:26.385Z [DEBUG] raft: Vote granted from server-dfb96550-ed93-ef4d-e8b7-038caf38b87e in term 2. Tally: 1
2019-12-30T19:14:26.385Z [INFO]  raft: Election won. Tally: 1
2019-12-30T19:14:26.386Z [INFO]  raft: Node at dfb96550-ed93-ef4d-e8b7-038caf38b87e [Leader] entering Leader state
2019-12-30T19:14:29.228Z [INFO]  raft: Starting snapshot up to 16386
2019/12/30 19:14:29 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot_BadRestore-snapshot017197734/before/snapshots/2-16386-1577733269229.tmp
2019-12-30T19:14:29.717Z [INFO]  raft: Compacting logs from 1 to 6146
2019-12-30T19:14:29.720Z [INFO]  raft: Snapshot to 16386 complete
2019-12-30T19:14:32.630Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-454fa51d-ce3a-6ea3-108c-08ac29340bb8 Address:454fa51d-ce3a-6ea3-108c-08ac29340bb8}]
2019-12-30T19:14:32.630Z [INFO]  raft: Node at 454fa51d-ce3a-6ea3-108c-08ac29340bb8 [Follower] entering Follower state (Leader: "")
2019-12-30T19:14:33.776Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-30T19:14:33.777Z [INFO]  raft: Node at 454fa51d-ce3a-6ea3-108c-08ac29340bb8 [Candidate] entering Candidate state in term 2
2019-12-30T19:14:33.777Z [DEBUG] raft: Votes needed: 1
2019-12-30T19:14:33.777Z [DEBUG] raft: Vote granted from server-454fa51d-ce3a-6ea3-108c-08ac29340bb8 in term 2. Tally: 1
2019-12-30T19:14:33.777Z [INFO]  raft: Election won. Tally: 1
2019-12-30T19:14:33.777Z [INFO]  raft: Node at 454fa51d-ce3a-6ea3-108c-08ac29340bb8 [Leader] entering Leader state
[ERR] snapshot: Failed to close snapshot decompressor: unexpected EOF
--- PASS: TestSnapshot_BadRestore (9.28s)
PASS
ok  	github.com/hashicorp/consul/snapshot	42.923s
?   	github.com/hashicorp/consul/testrpc	[no test files]
=== RUN   TestConfigurator_outgoingWrapper_OK
--- PASS: TestConfigurator_outgoingWrapper_OK (0.16s)
=== RUN   TestConfigurator_outgoingWrapper_noverify_OK
--- PASS: TestConfigurator_outgoingWrapper_noverify_OK (0.15s)
=== RUN   TestConfigurator_outgoingWrapper_BadDC
--- PASS: TestConfigurator_outgoingWrapper_BadDC (0.14s)
=== RUN   TestConfigurator_outgoingWrapper_BadCert
--- PASS: TestConfigurator_outgoingWrapper_BadCert (0.14s)
=== RUN   TestConfigurator_wrapTLS_OK
--- PASS: TestConfigurator_wrapTLS_OK (0.15s)
=== RUN   TestConfigurator_wrapTLS_BadCert
--- PASS: TestConfigurator_wrapTLS_BadCert (0.15s)
=== RUN   TestConfig_ParseCiphers
--- PASS: TestConfig_ParseCiphers (0.00s)
=== RUN   TestConfigurator_loadKeyPair
--- PASS: TestConfigurator_loadKeyPair (0.01s)
=== RUN   TestConfig_SpecifyDC
--- PASS: TestConfig_SpecifyDC (0.00s)
=== RUN   TestConfigurator_NewConfigurator
--- PASS: TestConfigurator_NewConfigurator (0.00s)
=== RUN   TestConfigurator_ErrorPropagation
--- PASS: TestConfigurator_ErrorPropagation (0.06s)
=== RUN   TestConfigurator_CommonTLSConfigServerNameNodeName
--- PASS: TestConfigurator_CommonTLSConfigServerNameNodeName (0.00s)
=== RUN   TestConfigurator_loadCAs
--- PASS: TestConfigurator_loadCAs (0.03s)
=== RUN   TestConfigurator_CommonTLSConfigInsecureSkipVerify
--- PASS: TestConfigurator_CommonTLSConfigInsecureSkipVerify (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigPreferServerCipherSuites
--- PASS: TestConfigurator_CommonTLSConfigPreferServerCipherSuites (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigCipherSuites
--- PASS: TestConfigurator_CommonTLSConfigCipherSuites (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigGetClientCertificate
--- PASS: TestConfigurator_CommonTLSConfigGetClientCertificate (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigCAs
--- PASS: TestConfigurator_CommonTLSConfigCAs (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigTLSMinVersion
--- PASS: TestConfigurator_CommonTLSConfigTLSMinVersion (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigVerifyIncoming
--- PASS: TestConfigurator_CommonTLSConfigVerifyIncoming (0.00s)
=== RUN   TestConfigurator_OutgoingRPCTLSDisabled
--- PASS: TestConfigurator_OutgoingRPCTLSDisabled (0.00s)
=== RUN   TestConfigurator_VerifyIncomingRPC
--- PASS: TestConfigurator_VerifyIncomingRPC (0.00s)
=== RUN   TestConfigurator_VerifyIncomingHTTPS
--- PASS: TestConfigurator_VerifyIncomingHTTPS (0.00s)
=== RUN   TestConfigurator_EnableAgentTLSForChecks
--- PASS: TestConfigurator_EnableAgentTLSForChecks (0.00s)
=== RUN   TestConfigurator_IncomingRPCConfig
--- PASS: TestConfigurator_IncomingRPCConfig (0.00s)
=== RUN   TestConfigurator_IncomingHTTPSConfig
--- PASS: TestConfigurator_IncomingHTTPSConfig (0.00s)
=== RUN   TestConfigurator_OutgoingTLSConfigForChecks
--- PASS: TestConfigurator_OutgoingTLSConfigForChecks (0.00s)
=== RUN   TestConfigurator_OutgoingRPCConfig
--- PASS: TestConfigurator_OutgoingRPCConfig (0.00s)
=== RUN   TestConfigurator_OutgoingRPCWrapper
--- PASS: TestConfigurator_OutgoingRPCWrapper (0.00s)
    config_test.go:699: TODO: actually call wrap here eventually
=== RUN   TestConfigurator_UpdateChecks
--- PASS: TestConfigurator_UpdateChecks (0.00s)
=== RUN   TestConfigurator_UpdateSetsStuff
--- PASS: TestConfigurator_UpdateSetsStuff (0.00s)
=== RUN   TestConfigurator_ServerNameOrNodeName
--- PASS: TestConfigurator_ServerNameOrNodeName (0.00s)
=== RUN   TestConfigurator_VerifyOutgoing
--- PASS: TestConfigurator_VerifyOutgoing (0.00s)
=== RUN   TestConfigurator_Domain
--- PASS: TestConfigurator_Domain (0.00s)
=== RUN   TestConfigurator_VerifyServerHostname
--- PASS: TestConfigurator_VerifyServerHostname (0.00s)
=== RUN   TestConfigurator_AutoEncrytCertExpired
--- PASS: TestConfigurator_AutoEncrytCertExpired (0.02s)
=== RUN   TestSerialNumber
--- PASS: TestSerialNumber (0.00s)
=== RUN   TestGeneratePrivateKey
=== PAUSE TestGeneratePrivateKey
=== RUN   TestGenerateCA
=== PAUSE TestGenerateCA
=== RUN   TestGenerateCert
--- SKIP: TestGenerateCert (0.00s)
    generate_test.go:102: DM-skipped
=== CONT  TestGeneratePrivateKey
=== CONT  TestGenerateCA
--- PASS: TestGeneratePrivateKey (0.01s)
--- PASS: TestGenerateCA (0.01s)
PASS
ok  	github.com/hashicorp/consul/tlsutil	1.086s
?   	github.com/hashicorp/consul/types	[no test files]
?   	github.com/hashicorp/consul/version	[no test files]
FAIL
dh_auto_test: cd _build && go test -vet=off -v -p 4 -short -failfast -timeout 7m github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update 
github.com/hashicorp/consul/command/acl/role github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check 
github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect 
github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version returned exit code 1
make[1]: *** [debian/rules:51: override_dh_auto_test] Error 255
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
make: *** [debian/rules:13: build-arch] Error 2
dpkg-buildpackage: error: debian/rules build-arch subprocess returned exit status 2
--------------------------------------------------------------------------------
Build finished at 2019-12-30T19:14:39Z

Finished
--------


+------------------------------------------------------------------------------+
| Cleanup                                                                      |
+------------------------------------------------------------------------------+

Purging /<<BUILDDIR>>
Not cleaning session: cloned chroot in use
E: Build failure (dpkg-buildpackage died)

+------------------------------------------------------------------------------+
| Summary                                                                      |
+------------------------------------------------------------------------------+

Build Architecture: armhf
Build-Space: 0
Build-Time: 2490
Distribution: bullseye-staging
Fail-Stage: build
Host Architecture: armhf
Install-Time: 1976
Job: consul_1.5.2+dfsg1-6
Machine Architecture: armhf
Package: consul
Package-Time: 4525
Source-Version: 1.5.2+dfsg1-6
Space: 0
Status: failed
Version: 1.5.2+dfsg1-6
--------------------------------------------------------------------------------
Finished at 2019-12-30T19:14:39Z
Build needed 00:00:00, 0k disc space